CSCC10 – Human-Computer Interaction: Core Vocabulary

A comprehensive set of vocabulary flashcards capturing key terms and definitions from CSCC10 lectures on Human-Computer Interaction, including design processes, evaluation methods, usability principles, prototyping, and ethics.

88 Terms

1

Human-Computer Interaction (HCI)

Interdisciplinary field studying how people interact with computers in order to design usable, effective and enjoyable systems.

2

Usability

Quality attribute measuring how easy a user interface is to learn, how efficient it is to use, how well it is remembered, how few errors users make, and how satisfying it is to use.

3

Universal Usability

Design approach that aims to make interactive systems usable by the widest possible range of people regardless of ability, age, culture or technology.

4

Iterative Design

Cyclic process of prototyping, testing, analyzing and refining a product until it meets user needs.

5

Heuristic Evaluation

Expert review method where evaluators examine an interface against a set of recognized usability principles (heuristics).

6

Usability Testing

Empirical method that observes representative users performing tasks with a product to identify problems and measure performance.

7

Ethics (in HCI)

Standards and practices that protect participants’ rights, privacy and well-being during research and evaluation.

8

Institutional Review Board (IRB)

Authoritative body that reviews and approves research protocols involving human participants for ethical compliance.

9

Informed Consent Form

Document that explains study purpose, procedures, risks and rights, which participants sign voluntarily before taking part.

10

Controlled Setting

Laboratory or similar environment where evaluators tightly manage variables during user studies.

11

Natural Setting

Real-world environment (field or ‘in the wild’) where products are evaluated in authentic use contexts.

12

Living Lab

Instrumented real-life environment (e.g., Aware Home) used to study long-term technology use with minimal intrusion.

13

Usability Laboratory

Dedicated facility with a testing room for participants and an observation room for evaluators, often separated by a one-way mirror.

14

Data Logging

Automatic recording of user interactions (clicks, errors, timing) for later analysis.
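A minimal sketch of what such logging might look like, assuming a hypothetical log_event helper that appends timestamped interaction records (clicks, errors, task timings) to a file for later analysis; the event names and fields are illustrative and not tied to any particular logging tool.

  import json
  import time

  LOG_PATH = "interaction_log.jsonl"  # hypothetical output file

  def log_event(event_type, **details):
      # Append one timestamped interaction record for later analysis.
      record = {"timestamp": time.time(), "event": event_type, **details}
      with open(LOG_PATH, "a") as f:
          f.write(json.dumps(record) + "\n")

  # Example calls during a session:
  log_event("click", target="submit_button")
  log_event("error", message="invalid email format")
  log_event("task_complete", task="checkout", duration_s=42.3)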

15

Think-Aloud Technique

Method where users verbalize their thoughts while performing tasks, providing insight into reasoning and struggles.

16

A/B Testing

Experimental method that compares two interface versions with real users to determine which performs better.
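A minimal sketch of the core mechanics, assuming hypothetical helpers assign_variant and conversion_rate: users are split deterministically between versions A and B, and task-completion rates are compared. A real study would also apply a statistical significance test before drawing conclusions.

  import hashlib

  def assign_variant(user_id):
      # Deterministically split users between interface versions A and B.
      digest = hashlib.sha256(user_id.encode()).hexdigest()
      return "A" if int(digest, 16) % 2 == 0 else "B"

  def conversion_rate(outcomes):
      # Fraction of users in a variant who completed the target task.
      return sum(outcomes) / len(outcomes) if outcomes else 0.0

  print(assign_variant("participant_017"))  # "A" or "B"

  # Illustrative outcomes: 1 = task completed, 0 = abandoned
  results = {"A": [1, 0, 1, 1, 0, 1], "B": [1, 1, 1, 0, 1, 1]}
  for variant, outcomes in results.items():
      print(variant, f"{conversion_rate(outcomes):.2f}")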

17

Remote Usability Testing

Usability evaluation conducted with participants at a distance using screen-sharing or logging tools.

18

Field Study

Observation of users in their own environment to understand natural behaviors and context of use.

19

Eye-Tracking

Technology that records where and how long users look at interface elements to uncover attention patterns.

20

Analytics

Collection and interpretation of large-scale usage data (e.g., Google Analytics) to evaluate traffic and user behavior.

21

Reliability (in evaluation)

Degree to which a method produces consistent results across occasions.

22

Validity

Extent to which an evaluation method measures what it intends to measure.

23

Ecological Validity

How accurately evaluation results represent real-world usage conditions.

24

Bias

Systematic distortion in data collection or interpretation that misrepresents true user behavior.

25

Scope (of results)

Range over which evaluation findings can be generalized to other users, tasks or contexts.

26

Jakob Nielsen

Usability expert who formulated the 10 usability heuristics and popularized heuristic evaluation.

27

Nielsen’s 10 Heuristics

  1. Visibility of System Status: Keep users informed about what is going on through appropriate feedback.

  2. Match Between System and Real World: Speak the users' language, using familiar concepts and a natural order.

  3. User Control and Freedom: Provide clear 'emergency exits' and support undo/redo.

  4. Consistency and Standards: Adhere to conventions to prevent user confusion.

  5. Error Prevention: Design to prevent errors from occurring in the first place.

  6. Recognition Rather Than Recall: Make objects, actions, and options visible to minimize memory load.

  7. Flexibility and Efficiency of Use: Offer accelerators for expert users while accommodating novices.

  8. Aesthetic and Minimalist Design: Avoid irrelevant information in dialogues to reduce clutter.

  9. Help Users Recognize, Diagnose, and Recover from Errors: Provide plain-language error messages that suggest solutions.

  10. Help and Documentation: Ensure documentation is easy to search, task-focused, and concise.

28

Visibility of System Status

Heuristic stating systems should always keep users informed through timely feedback.

29

Match Between System and Real World

Heuristic requiring language and concepts familiar to users and natural task order.

30

User Control and Freedom

Heuristic emphasizing undo/redo and easy exits from unwanted states.

31

Consistency and Standards

Heuristic advising adherence to platform conventions so users never have to wonder whether different words, situations or actions mean the same thing.

32

Error Prevention

Heuristic advocating design that avoids problems before they occur or confirms risky actions.

33

Recognition Rather Than Recall

Heuristic encouraging visibility of objects and options so users don’t rely on memory.

34

Flexibility and Efficiency of Use

Heuristic supporting accelerators for experts while still accommodating novices.

35

Aesthetic and Minimalist Design

Heuristic recommending dialogs contain only relevant information to reduce clutter.

36

Help Users Recognize, Diagnose, and Recover from Errors

Heuristic calling for plain-language error messages that suggest solutions.

37

Help and Documentation

Heuristic noting that even well-designed systems may still need easy-to-search, task-focused assistance.

38

Cognitive Walkthrough

Expert method focusing on ease of learning by mentally stepping through tasks from users’ perspectives.

39

Persona

Fictional archetype representing a key user group’s goals, behaviors and characteristics.

40

Scenario

Narrative description of users performing tasks with a proposed system, used to explore requirements or design.

41

Storyboard

Series of sketches illustrating user interaction flow, often accompanying scenarios.

42

Prototype

Early model of a product—ranging from sketches to interactive software—that allows exploration and testing of ideas.

43

Low-Fidelity Prototype

Quick, inexpensive representation (e.g., paper, cardboard) used early for conceptual feedback.

44

High-Fidelity Prototype

Detailed, interactive model resembling final product in look and feel, used for thorough testing.

45

Medium-Fidelity Prototype (Wireframe)

Skeleton interface showing layout and navigation with limited graphics and functionality.

46

Wizard-of-Oz Prototyping

Technique where a human simulates system responses unbeknownst to users during early testing.

47

Conceptual Design

Stage that outlines what users can do with a product and the concepts needed to understand it.

48

Concrete Design

Stage that specifies detailed interface elements such as colors, icons and layout.

49

User-Centered Design (UCD)

Design philosophy that incorporates users’ needs, wants and limitations at every phase of development.

50

Participatory Design

Approach that involves users as active collaborators in the design process.

51

Agile Interaction Design

Adaptation of agile development emphasizing rapid, flexible, user-focused design iterations.

52

Hierarchical Task Analysis (HTA)

Method that decomposes tasks into goals, sub-goals and plans to understand user activities.
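A minimal sketch of how such a decomposition could be represented and printed as a tree; the "borrow a library book" goals, sub-goals and plan are an invented example, not taken from the lectures.

  # Hypothetical HTA for "borrow a library book", as a nested structure:
  hta = {
      "goal": "0. Borrow a book from the library",
      "plan": "do 1-2-3 in order",
      "subtasks": [
          {"goal": "1. Find the book", "subtasks": [
              {"goal": "1.1 Search the catalogue"},
              {"goal": "1.2 Note the shelf location"},
          ]},
          {"goal": "2. Retrieve the book from the shelf"},
          {"goal": "3. Check the book out at the desk"},
      ],
  }

  def print_hta(node, depth=0):
      # Print goals and sub-goals with indentation showing the hierarchy.
      print("  " * depth + node["goal"])
      for sub in node.get("subtasks", []):
          print_hta(sub, depth + 1)

  print_hta(hta)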

53

Contextual Inquiry

Field interview technique treating the user as expert and designer as apprentice to uncover work practices.

54

Survey

Questionnaire method that gathers structured or open-ended data from many respondents.

55

Interview (in HCI)

One-on-one or group discussion used to explore users’ needs, attitudes and experiences.

56

Focus Group

Facilitated group discussion that probes consensus and divergent opinions among users.

57

Closed Question

Survey or interview item with predefined answer options, enabling easy analysis.

58

Open Question

Survey or interview item allowing free-form responses for richer qualitative data.

59

Likert Scale

Rating scale (e.g., 1–5) measuring agreement or frequency in questionnaires.
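A minimal sketch of summarizing hypothetical 1–5 agreement responses to a single questionnaire item; the statement and numbers are invented for illustration.

  from collections import Counter
  from statistics import mean, median

  # Hypothetical responses to "The interface was easy to use"
  # (1 = strongly disagree ... 5 = strongly agree)
  responses = [4, 5, 3, 4, 5, 2, 4, 4, 5, 3]

  print("mean:", mean(responses))      # interpret with caution: the scale is ordinal
  print("median:", median(responses))  # often preferred for ordinal data
  print("distribution:", Counter(responses))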

60

Observation (Direct)

Watching users in the field or lab to record behaviors and context without relying on self-report.

61

Indirect Observation

Collection of user activity via diaries, logs or recordings when the evaluator is not present.

62

Web Analytics

Measurement, collection and analysis of web data to optimize site usage and performance.

63

Morae

Commercial software suite that records and analyzes user interactions during usability tests.

64

Data Triangulation

Use of multiple data sources or methods to cross-validate findings and increase credibility.

65

Pilot Study

Small-scale trial run of a study used to refine procedures and instruments.

66

Eight Golden Rules of Interface Design

  1. Strive for consistency.

  2. Enable frequent users to use shortcuts.

  3. Offer informative feedback.

  4. Design dialogs to yield closure.

  5. Offer simple error handling.

  6. Permit easy reversal of actions.

  7. Support internal locus of control.

  8. Reduce short-term memory load.

67

Pragmatic UX

Aspect of user experience concerned with how practical and effective a product is for completing tasks.

68

Hedonic UX

Aspect of user experience relating to emotional, aesthetic and personal stimulation provided by a product.

69

Moore’s Law

Observation that the number of transistors on integrated circuits—and thus computing power—roughly doubles every two years.
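A small worked example of the doubling relationship; the starting transistor count and time span are illustrative only.

  def projected_transistors(initial_count, years_elapsed, doubling_period=2):
      # Project transistor count assuming a doubling every doubling_period years.
      return initial_count * 2 ** (years_elapsed / doubling_period)

  # Illustrative: starting from 1,000,000 transistors, project 10 years ahead
  print(projected_transistors(1_000_000, 10))  # 1,000,000 * 2**5 = 32,000,000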

70

Fidelity (of Prototype)

Degree to which a prototype resembles the final product in detail and interactivity.

71

Waterfall Model

Linear software development model with sequential phases and little iteration.

72

Return on Investment (ROI) in Usability

Business case showing that spending on usability yields cost savings or revenue gains through improved user experience.

73

Accessibility

Designing products so people with disabilities can perceive, understand, navigate and interact with them.

74

Inclusiveness

Creating products and services that accommodate the widest number of users regardless of disability, age or context.

75

Eye-Tracking Metrics

Measurements such as fixation duration and saccades that indicate how users visually process interfaces.
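A minimal sketch of one way such metrics might be aggregated, assuming fixation records have already been extracted as (area of interest, duration) pairs; the areas and durations are invented for illustration.

  from collections import defaultdict

  # Hypothetical fixation records: (area of interest, fixation duration in ms)
  fixations = [("search_box", 310), ("nav_menu", 120), ("search_box", 450), ("banner", 90)]

  total_by_aoi = defaultdict(int)
  for aoi, duration_ms in fixations:
      total_by_aoi[aoi] += duration_ms

  # Rank areas by total visual attention received
  for aoi, total in sorted(total_by_aoi.items(), key=lambda kv: -kv[1]):
      print(aoi, f"{total} ms")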

76

Evaluation Scope

Extent to which study results are generalizable beyond the tested sample or context.

77

Empirical Measurement

Collection of observable data (e.g., time, errors) to objectively assess usability.
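A minimal sketch of aggregating such measurements, assuming each participant's task attempt has already been recorded as a (completion time, error count) pair; the values are invented for illustration.

  from statistics import mean

  # Hypothetical per-participant measurements: (completion time in seconds, errors)
  observations = [(42.1, 1), (35.4, 0), (58.9, 3), (40.2, 0), (47.6, 2)]

  times = [t for t, _ in observations]
  errors = [e for _, e in observations]

  print(f"mean completion time: {mean(times):.1f} s")
  print(f"mean errors per task: {mean(errors):.1f}")
  print(f"error-free attempts: {sum(e == 0 for e in errors)} of {len(errors)}")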

78

Accelerator

Hidden or advanced shortcut that speeds interaction for expert users without affecting novices.

79

Discount Usability Testing

Low-cost, rapid testing approach using small sample sizes and simple prototypes.

80

Competitive Usability Testing

Comparative evaluation of a product against competitor interfaces to benchmark usability.

81

Can-You-Break-This Test

Exploratory session where participants are encouraged to intentionally find flaws or break the system.

82

One-Way Mirror

Pane that allows observers to watch participants in a usability lab without being seen.

83

Task Scenario

Concrete description of goals users must accomplish during a usability test.

84

Validity Threat

Factor, such as bias or artificial setting, that can compromise the accuracy of evaluation findings.

85

Stakeholder

Any individual or group who influences or is influenced by a product’s success or failure.

86

Formative Evaluation

Assessment conducted during development to guide design improvements.

87

Summative Evaluation

Assessment performed on finished products to judge overall quality against benchmarks.

88

Data Gathering Techniques

  • Surveys

  • Interviews

  • Focus Groups

  • Observation (Direct & Indirect)

  • Contextual Inquiry

  • Think-Aloud Technique

  • Data Logging

  • Eye-Tracking

  • A/B Testing

  • Web Analytics

  • Usability Testing (which encompasses many of these methods)

  • Remote Usability Testing

  • Field Study

  • Competitive Usability Testing

  • Can-You-Break-This Test