A comprehensive set of vocabulary flashcards capturing key terms and definitions from CSCC10 lectures on Human-Computer Interaction, including design processes, evaluation methods, usability principles, prototyping, and ethics.
Human-Computer Interaction (HCI)
Interdisciplinary field studying how people interact with computers in order to design usable, effective and enjoyable systems.
Usability
Quality attribute measuring how learnable, efficient, memorable, error-tolerant and satisfying a user interface is.
Universal Usability
Design approach that aims to make interactive systems usable by the widest possible range of people regardless of ability, age, culture or technology.
Iterative Design
Cyclic process of prototyping, testing, analyzing and refining a product until it meets user needs.
Heuristic Evaluation
Expert review method where evaluators examine an interface against a set of recognized usability principles (heuristics).
Usability Testing
Empirical method that observes representative users performing tasks with a product to identify problems and measure performance.
Ethics (in HCI)
Standards and practices that protect participants’ rights, privacy and well-being during research and evaluation.
Institutional Review Board (IRB)
Authoritative body that reviews and approves research protocols involving human participants for ethical compliance.
Informed Consent Form
Document that explains study purpose, procedures, risks and rights, which participants sign voluntarily before taking part.
Controlled Setting
Laboratory or similar environment where evaluators tightly manage variables during user studies.
Natural Setting
Real-world environment (field or ‘in the wild’) where products are evaluated in authentic use contexts.
Living Lab
Instrumented real-life environment (e.g., Aware Home) used to study long-term technology use with minimal intrusion.
Usability Laboratory
Dedicated facility with a testing room for participants and an observation room for evaluators, often separated by a one-way mirror.
Data Logging
Automatic recording of user interactions (clicks, errors, timing) for later analysis.
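As a quick illustration (not from the lectures), data logging can be sketched as a minimal event recorder; the event names and fields here are hypothetical.

```python
import time
from dataclasses import dataclass, field

@dataclass
class InteractionLog:
    """Minimal data logger: records user interaction events with timestamps."""
    events: list = field(default_factory=list)

    def record(self, event_type: str, target: str) -> None:
        # Store each interaction with a monotonic timestamp for later analysis.
        self.events.append({"t": time.monotonic(), "event": event_type, "target": target})

    def count(self, event_type: str) -> int:
        # Simple aggregate: how many events of a given type were logged.
        return sum(1 for e in self.events if e["event"] == event_type)

log = InteractionLog()
log.record("click", "submit_button")   # hypothetical UI events
log.record("error", "form_validation")
log.record("click", "help_link")
print(log.count("click"))  # → 2
```

A real study tool would also log timing between events and persist the log for analysis.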
Think-Aloud Technique
Method where users verbalize their thoughts while performing tasks, providing insight into reasoning and struggles.
A/B Testing
Experimental method that compares two interface versions with real users to determine which performs better.
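A sketch of how an A/B comparison might be analyzed, using a standard two-proportion z-test on conversion counts; the sample numbers are invented for illustration.

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis that A and B convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: version A converts 120/1000 users, version B 150/1000.
z = two_proportion_z(120, 1000, 150, 1000)
print(round(z, 2))  # → 1.96
```

A |z| near or above 1.96 suggests the difference is significant at the 5% level, so version B would likely be preferred here.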
Remote Usability Testing
Usability evaluation conducted with participants at a distance using screen-sharing or logging tools.
Field Study
Observation of users in their own environment to understand natural behaviors and context of use.
Eye-Tracking
Technology that records where and how long users look at interface elements to uncover attention patterns.
Analytics
Collection and interpretation of large-scale usage data (e.g., Google Analytics) to evaluate traffic and user behavior.
Reliability (in evaluation)
Degree to which a method produces consistent results across occasions.
Validity
Extent to which an evaluation method measures what it intends to measure.
Ecological Validity
How accurately evaluation results represent real-world usage conditions.
Bias
Systematic distortion in data collection or interpretation that misrepresents true user behavior.
Scope (of results)
Range over which evaluation findings can be generalized to other users, tasks or contexts.
Jakob Nielsen
Usability expert who formulated the 10 usability heuristics and popularized heuristic evaluation.
Nielsen’s 10 Heuristics
Visibility of System Status: Keep users informed about what is going on through appropriate feedback.
Match Between System and Real World: Speak the users' language, using familiar concepts and a natural order.
User Control and Freedom: Provide clear 'emergency exits' and support undo/redo.
Consistency and Standards: Adhere to conventions to prevent user confusion.
Error Prevention: Design to prevent errors from occurring in the first place.
Recognition Rather Than Recall: Make objects, actions, and options visible to minimize memory load.
Flexibility and Efficiency of Use: Offer accelerators for expert users while accommodating novices.
Aesthetic and Minimalist Design: Avoid irrelevant information in dialogues to reduce clutter.
Help Users Recognize, Diagnose, and Recover from Errors: Provide plain-language error messages that suggest solutions.
Help and Documentation: Ensure documentation is easy to search, task-focused, and concise.
Visibility of System Status
Heuristic stating systems should always keep users informed through timely feedback.
Match Between System and Real World
Heuristic requiring language and concepts familiar to users and natural task order.
User Control and Freedom
Heuristic emphasizing undo/redo and easy exits from unwanted states.
Consistency and Standards
Heuristic advising adherence to platform conventions so users don’t wonder if words or actions mean the same thing.
Error Prevention
Heuristic advocating design that avoids problems before they occur or confirms risky actions.
Recognition Rather Than Recall
Heuristic encouraging visibility of objects and options so users don’t rely on memory.
Flexibility and Efficiency of Use
Heuristic supporting accelerators for experts while still accommodating novices.
Aesthetic and Minimalist Design
Heuristic recommending dialogs contain only relevant information to reduce clutter.
Help Users Recognize, Diagnose, and Recover from Errors
Heuristic calling for plain-language error messages that suggest solutions.
Help and Documentation
Heuristic noting good systems may still need easy-to-search, task-focused assistance.
Cognitive Walkthrough
Expert method focusing on ease of learning by mentally stepping through tasks from users’ perspectives.
Persona
Fictional archetype representing a key user group’s goals, behaviors and characteristics.
Scenario
Narrative description of users performing tasks with a proposed system, used to explore requirements or design.
Storyboard
Series of sketches illustrating user interaction flow, often accompanying scenarios.
Prototype
Early model of a product—ranging from sketches to interactive software—that allows exploration and testing of ideas.
Low-Fidelity Prototype
Quick, inexpensive representation (e.g., paper, cardboard) used early for conceptual feedback.
High-Fidelity Prototype
Detailed, interactive model resembling final product in look and feel, used for thorough testing.
Medium-Fidelity Prototype (Wireframe)
Skeleton interface showing layout and navigation with limited graphics and functionality.
Wizard-of-Oz Prototyping
Technique where a human simulates system responses unbeknownst to users during early testing.
Conceptual Design
Stage that outlines what users can do with a product and the concepts needed to understand it.
Concrete Design
Stage that specifies detailed interface elements such as colors, icons and layout.
User-Centered Design (UCD)
Design philosophy that incorporates users’ needs, wants and limitations at every phase of development.
Participatory Design
Approach that involves users as active collaborators in the design process.
Agile Interaction Design
Adaptation of agile development emphasizing rapid, flexible, user-focused design iterations.
Hierarchical Task Analysis (HTA)
Method that decomposes tasks into goals, sub-goals and plans to understand user activities.
Contextual Inquiry
Field interview technique treating the user as expert and designer as apprentice to uncover work practices.
Survey
Questionnaire method that gathers structured or open-ended data from many respondents.
Interview (in HCI)
One-on-one or group discussion used to explore users’ needs, attitudes and experiences.
Focus Group
Facilitated group discussion that probes consensus and divergent opinions among users.
Closed Question
Survey or interview item with predefined answer options, enabling easy analysis.
Open Question
Survey or interview item allowing free-form responses for richer qualitative data.
Likert Scale
Rating scale (e.g., 1–5) measuring agreement or frequency in questionnaires.
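Likert responses are typically summarized with central-tendency statistics; a small sketch with invented ratings to a hypothetical "easy to use" item:

```python
from statistics import mean, median

# Hypothetical 1-5 Likert responses to "The interface was easy to use"
responses = [5, 4, 4, 3, 5, 2, 4, 4, 5, 3]

print(mean(responses))    # → 3.9
print(median(responses))  # → 4.0
```

Because Likert data are ordinal, many analysts prefer the median (or the full distribution of responses) over the mean.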
Observation (Direct)
Watching users in the field or lab to record behaviors and context without relying on self-report.
Indirect Observation
Collection of user activity via diaries, logs or recordings when the evaluator is not present.
Web Analytics
Measurement, collection and analysis of web data to optimize site usage and performance.
Morae
Commercial software suite that records and analyzes user interactions during usability tests.
Data Triangulation
Use of multiple data sources or methods to cross-validate findings and increase credibility.
Pilot Study
Small-scale trial run of a study used to refine procedures and instruments.
Eight Golden Rules of Interface Design
Strive for consistency.
Enable frequent users to use shortcuts.
Offer informative feedback.
Design dialogs to yield closure.
Offer simple error handling.
Permit easy reversal of actions.
Support internal locus of control.
Reduce short-term memory load.
Pragmatic UX
Aspect of user experience concerned with how practical and effective a product is for completing tasks.
Hedonic UX
Aspect of user experience relating to emotional, aesthetic and personal stimulation provided by a product.
Moore’s Law
Observation that the number of transistors on integrated circuits—and thus computing power—roughly doubles every two years.
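The doubling can be expressed as a simple exponential, N(t) = N₀ · 2^(t/2) with t in years; a sketch starting from the roughly 2,300 transistors of the Intel 4004:

```python
def transistors(n0: float, years: float, doubling_period: float = 2.0) -> float:
    """Projected transistor count after `years`, doubling every `doubling_period` years."""
    return n0 * 2 ** (years / doubling_period)

# ~2,300 transistors (Intel 4004, 1971) projected forward one decade.
print(transistors(2300, 10))  # → 73600.0
```

Ten years is five doubling periods, so the count grows by a factor of 2⁵ = 32.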
Fidelity (of Prototype)
Degree to which a prototype resembles the final product in detail and interactivity.
Waterfall Model
Linear software development model with sequential phases and little iteration.
Return on Investment (ROI) in Usability
Business case showing that spending on usability yields cost savings or revenue gains through improved user experience.
Accessibility
Designing products so people with disabilities can perceive, understand, navigate and interact with them.
Inclusiveness
Creating products and services that accommodate the widest number of users regardless of disability, age or context.
Eye-Tracking Metrics
Measurements such as fixation duration and saccades that indicate how users visually process interfaces.
Evaluation Scope
Extent to which study results are generalizable beyond the tested sample or context.
Empirical Measurement
Collection of observable data (e.g., time, errors) to objectively assess usability.
Accelerator
Hidden or advanced shortcut that speeds interaction for expert users without affecting novices.
Discount Usability Testing
Low-cost, rapid testing approach using small sample sizes and simple prototypes.
Competitive Usability Testing
Comparative evaluation of a product against competitor interfaces to benchmark usability.
Can-You-Break-This Test
Exploratory session where participants are encouraged to intentionally find flaws or break the system.
One-Way Mirror
Pane that allows observers to watch participants in a usability lab without being seen.
Task Scenario
Concrete description of goals users must accomplish during a usability test.
Validity Threat
Factor, such as bias or artificial setting, that can compromise the accuracy of evaluation findings.
Stakeholder
Any individual or group who influences or is influenced by a product’s success or failure.
Formative Evaluation
Assessment conducted during development to guide design improvements.
Summative Evaluation
Assessment performed on finished products to judge overall quality against benchmarks.
Data gathering techniques
Surveys
Interviews
Focus Groups
Observation (Direct & Indirect)
Contextual Inquiry
Think-Aloud Technique
Data Logging
Eye-Tracking
A/B Testing
Web Analytics
Usability Testing (which encompasses many of these methods)
Remote Usability Testing
Field Study
Competitive Usability Testing
Can-You-Break-This Test