WEEK 10 Usability Engineering Evaluation Concepts
CS2003 - Usability Engineering
UE Evaluation
Quickfire usability evaluation
Instructor: Dr. Monica Pereira
Email: monica.pereira@brunel.ac.uk
Lecture Overview
Today’s Lecture Topics:
Evaluation: Definition and Purpose
What it is and isn’t
Why we do it
Fit with User-Centered Design (UCD) and Usability Engineering (UE)
Role of Evaluation:
Informal and formal evaluation
Goals and factors to consider
Methods of Evaluation:
Discount usability testing
Heuristic evaluation and cognitive walkthrough
Strategies for quick and cheap evaluations
Timing of Evaluation:
When and how often to evaluate in the UCD process
Technical Software Evaluation
Overview
Formal Technical Reviews: Include non-developers
Technical Software Testing:
White-box testing: Basis path and control structure testing
Black-box testing: Focus on functional requirements
Software Testing Strategies:
Unit Testing
Integration Testing
Validation Testing
System Testing: Recovery, Security, Stress, Performance
Note: These techniques evaluate the software itself rather than user interaction, which is the focus of this module; a minimal black-box testing sketch follows
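To make the black-box idea concrete, here is a minimal sketch in Python, assuming a hypothetical ticket-machine fare function (the function, fares, and test cases are invented for illustration):

    import unittest

    def calculate_fare(zones: int) -> float:
        """Hypothetical fare rule: base fare plus an increment per extra zone."""
        if zones < 1:
            raise ValueError("at least one zone required")
        return 2.50 + 1.20 * (zones - 1)

    class BlackBoxFareTests(unittest.TestCase):
        """Black-box tests: derived from the functional spec, not the code's internals."""

        def test_single_zone_charges_base_fare(self):
            self.assertEqual(calculate_fare(1), 2.50)

        def test_each_extra_zone_adds_increment(self):
            self.assertAlmostEqual(calculate_fare(3), 2.50 + 2 * 1.20)

        def test_invalid_zone_count_is_rejected(self):
            with self.assertRaises(ValueError):
                calculate_fare(0)

    if __name__ == "__main__":
        unittest.main()

White-box tests, by contrast, would be designed from the code's control structure, e.g. covering each basis path through calculate_fare.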
Understanding Usability Evaluations
Significance of Usability Evaluations
Shift focus from "Does the software achieve its task?" to "How well and in what way does the software assist the user?"
Key question: How to assess and measure usability meaningfully?
Definitions of Evaluation
Evaluation: Process to gather information about system usability to improve or assess the system. (Preece, 1994)
Evaluation Method: Procedure for collecting relevant data on the usability of a computer system. (Preece, 1994)
Key questions addressed:
How to choose between interface alternatives?
How to improve an existing design?
Rationale for Evaluation
Goals of Usability Evaluation
Compare alternative designs and assess progress toward engineering goals
Check conformance to standards
Understand real-world efficiency and effectiveness
Reasons for Conducting Usability Evaluations
Key Questions to Address:
How effectively does the design facilitate task completion?
Are users able to manage emergency scenarios?
Usability metrics for success and complexity (a worked sketch follows this list):
Example: how easy is the ticket machine to use?
Is screen size correlated with selection errors?
Are there escape mechanisms for recovering from erroneous commands?
What is the rate of user retention, and what is the impact on organizational adoption?
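As a worked illustration of how such questions become measurable, a minimal sketch in Python with invented session data (the field names and figures are assumptions):

    # Invented observation data from three ticket-machine sessions.
    sessions = [
        {"user": "P1", "completed": True,  "errors": 0, "seconds": 42},
        {"user": "P2", "completed": True,  "errors": 2, "seconds": 71},
        {"user": "P3", "completed": False, "errors": 5, "seconds": 120},
    ]

    # Standard descriptive metrics: success rate, error rate, time on task.
    success_rate = sum(s["completed"] for s in sessions) / len(sessions)
    mean_errors = sum(s["errors"] for s in sessions) / len(sessions)
    mean_time = sum(s["seconds"] for s in sessions) / len(sessions)

    print(f"Task success rate: {success_rate:.0%}")       # 67%
    print(f"Mean errors per session: {mean_errors:.1f}")  # 2.3
    print(f"Mean time on task: {mean_time:.0f}s")         # 78s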
Establishing Goals for Evaluation
Key Evaluation Goals
Assess system functionality and appropriateness
Evaluate the effect of the interface on user experience
Ease of learning, usability, and match with user expectations
Identify specific design problems, including contextual aspects
Key Considerations for Selecting Evaluation Approaches
Factors to Consider
Characteristics of users
Types of user activities
Environment where the study takes place
Nature of the artifact being evaluated
Role of Usability Evaluator
Responsibilities
The evaluator acts as a critical friend to provide constructive criticism to designers
The Trunk Test: Website Evaluation
Concept
Imagine being blindfolded, driven around, and dropped onto a random page deep within a website (Krug's 'trunk test'). Looking only at that page, can you answer these questions:
What site is this?
What page am I on?
What are the key sections available?
What are my local navigation options?
How can I search?
Common Mistakes in Usability Testing
Common Pitfalls
Overreliance on 'common sense'
Incorrectly assuming that you are a typical user (testing only on yourself)
Failure to involve representative users
Delaying testing until it is too late to act on the findings
HCI Design Process and Evaluation
Life Cycle of HCI Design
Star life cycle includes:
Requirements specification
Conceptual design
Prototyping
Evaluation, which sits at the centre of the star and connects all the other activities (ISO 13407)
Timing of Evaluation
Phases of Evaluation
Formative: Helps in decision-making and identifying issues early
Summative: Conducted at the end of the design process to ensure goals are met
Iterative formative testing is favored over a single late summative test
Forms of Usability Evaluation
Three Main Types
Analytical Testing: Expert review, involving opinions of usability experts
Abstract Testing: Drawing from existing data, such as cognitive analysis
Empirical Testing: Direct user testing through qualitative studies
Expert Review Techniques
Quickfire Approaches
Heuristic Evaluation: Scorecard approach assessing against usability principles
Cognitive Walkthrough: Stepping through the actions of a task as a typical user would, questioning each step
Formal Usability Inspections: Collaborative meetings to discuss strengths and weaknesses
Analytic Method: Heuristic Evaluation
Process
Experts evaluate the user interface against usability principles
Duration: Typically 1-2 hours
Output: List of usability issues identified
Problem Reporting
Guidelines for Reporting Issues
Each problem report should include (a minimal record sketch follows this list):
Description
Anticipated user difficulties
Context of the problem
Assumed causes
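To keep reports consistent across evaluators, the four fields above can be captured in a simple record type; this sketch is illustrative, with names chosen to mirror the list (not a standard schema):

    from dataclasses import dataclass

    @dataclass
    class UsabilityProblem:
        description: str      # what the problem is
        user_difficulty: str  # anticipated difficulty for users
        context: str          # where and when the problem occurs
        assumed_cause: str    # the evaluator's hypothesis about why it happens

    report = UsabilityProblem(
        description="'Cancel' is hidden below the fold",
        user_difficulty="Users cannot abandon an accidental purchase",
        context="Ticket-machine checkout screen",
        assumed_cause="Layout prioritises promotions over controls",
    )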
Key Heuristic Principles
1. Visibility of System Status
Keep users informed through timely feedback, matched to the system's response time
2. Match Between System and Real World
Speak the users' language and follow real-world conventions
3. User Control and Freedom
Features like Cancel and Undo should be readily available (see the sketch after this list)
4. Consistency and Standards
Maintain user expectations through standard practices
5. Error Prevention
Design so that errors are prevented from occurring in the first place
6. Recognition Rather than Recall
Minimize the user's memory load by making objects, actions, and options visible
7. Flexibility and Efficiency of Use
Provide shortcuts for experienced users
8. Aesthetic and Minimalist Design
Avoid overloading users with unnecessary information
9. Help Users Recognize, Diagnose, and Recover from Errors
Clear error messages without technical jargon
10. Help and Documentation
Provide ample resources that are easily accessible and relevant
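To make principles 3 and 5 concrete, a small illustrative sketch (not from the lecture) of an undo mechanism that gives users an emergency exit from mistaken actions:

    class UndoableEditor:
        """Every action can be reversed, so users are never trapped in an unintended state."""

        def __init__(self):
            self.text = ""
            self._history = []  # snapshots saved before each change

        def type(self, s: str):
            self._history.append(self.text)  # save state before modifying it
            self.text += s

        def undo(self):
            if self._history:  # quietly do nothing when there is nothing to undo
                self.text = self._history.pop()

    editor = UndoableEditor()
    editor.type("Hello")
    editor.type(" wrold")  # the user makes a mistake...
    editor.undo()          # ...and recovers immediately
    assert editor.text == "Hello"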
Evaluation Prioritization
Rating Issues
Problems should be rated by frequency and impact to prioritize corrective action (a minimal scoring sketch follows)
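A minimal scoring sketch; the 1-4 scale and the multiplicative weighting are assumptions for illustration, not a standard:

    def severity(frequency: int, impact: int) -> int:
        """frequency and impact on a 1 (low) to 4 (high) scale; higher product = fix sooner."""
        assert 1 <= frequency <= 4 and 1 <= impact <= 4
        return frequency * impact

    issues = {
        "'Cancel' hidden below the fold": severity(frequency=3, impact=4),  # 12
        "Inconsistent icon set":          severity(frequency=4, impact=1),  # 4
        "Typo on the help page":          severity(frequency=1, impact=1),  # 1
    }
    for name, score in sorted(issues.items(), key=lambda kv: -kv[1]):
        print(f"{score:>2}  {name}")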
Heuristic Evaluation Process (Continued)
Combining Findings
Experts should pool their individual findings to build a comprehensive issue map (an aggregation sketch follows)
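A minimal aggregation sketch with invented findings: problems reported independently by several evaluators rise to the top of the issue map:

    from collections import Counter

    findings = {
        "expert_a": {"hidden cancel button", "jargon in error messages"},
        "expert_b": {"hidden cancel button", "no visible search box"},
        "expert_c": {"hidden cancel button", "jargon in error messages"},
    }

    # Count how many evaluators reported each problem.
    issue_map = Counter(p for problems in findings.values() for p in problems)
    for problem, count in issue_map.most_common():
        print(f"{count}/{len(findings)} evaluators: {problem}")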
Limitations of Expert Reviews
Lack of ecological validity and potential biases
Expert reviews therefore need to be combined with testing involving real users
Observational Testing Approaches
Empirical/User Testing
Evaluate through observing usage patterns
Strategies include 'break it' testing, think-aloud protocols, and software logging with attention to user privacy (a minimal logging sketch follows)
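A minimal logging sketch: hashing the user identifier pseudonymises the record, so usage patterns can be analysed without storing who the user is (the events are invented; a real study would also need consent and stronger safeguards):

    import hashlib
    import json
    import time

    def log_event(user_id: str, event: str, path: str = "usage.log"):
        """Append one usage event; the raw identifier is hashed, never stored."""
        record = {
            "ts": time.time(),
            "user": hashlib.sha256(user_id.encode()).hexdigest()[:12],
            "event": event,
        }
        with open(path, "a") as f:
            f.write(json.dumps(record) + "\n")

    log_event("alice@example.com", "selected '3 zones'")
    log_event("alice@example.com", "pressed Cancel")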
Conclusion
Summary of Content
Focused on heuristic evaluation and related methods for catching usability problems early in the design process
Real user feedback remains critical for validating a design's effectiveness and adaptability