Cognitive Ergonomics and Design Thinking Notes
Cognitive Ergonomics and Design Thinking
- Cognitive ergonomics and design thinking are applicable to complex systems where human lives are at stake.
- Most accidents are attributed to human error but often result from poor design.
- Principles for high-quality, human-centered design are relevant for user enjoyment and can save lives.
Three Mile Island Accident
- On March 28, 1979, the Three Mile Island nuclear accident occurred, the most severe accident at a commercial nuclear power plant in U.S. history.
- A minor malfunction in unit TMI-2 escalated into a partial core meltdown and the release of radioactive gas.
- Sequence of Events:
- A minor fault in the secondary cooling circuit caused the primary coolant temperature to rise.
- The reactor automatically shut down via an emergency stop in about one second.
- Even when shut down, the reactor generates residual heat from radioactive material decay, which must be dissipated through the cooling system.
- Pressure in the primary cooling system rose, causing a safety valve to open. However, the valve did not close when the pressure dropped.
- The control panel did not indicate the valve's actual state (open or closed), only whether it was powered.
- A large portion of the primary coolant was lost, forming steam bubbles in the pipes, preventing adequate cooling and causing the core temperature to rise, leading to severe damage.
- Radioactive gases were released into the atmosphere.
- Usability Issues:
- Investigators found significant usability issues in the control room design.
- The user interface was a critical factor in the failure to prevent the catastrophe.
- The status indicator only showed if the valve was powered, not its actual state (open or closed).
- Operators mistakenly believed the valve was closed when it was open, remaining unaware of the disaster for several hours, resulting in extensive damage.
- Initial Blame:
- Initially, blame was placed on operators for 'incorrect diagnosis of the problems in the plant' (human error).
- Researchers' Conclusion:
- The mistake could hardly be attributed to the operators.
- The accident resulted from equipment failure and flaws in the control room design.
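The indicator flaw described above can be sketched as a minimal model; the class and field names are hypothetical and do not reflect the actual plant logic:

```python
from dataclasses import dataclass

@dataclass
class ReliefValve:
    """Minimal model of the stuck-open relief valve (hypothetical code)."""
    solenoid_powered: bool = False  # the signal the panel light showed
    actually_open: bool = False     # the physical valve position

def panel_indicator(valve: ReliefValve) -> str:
    # Flawed design: the indicator reflects the command signal
    # (solenoid power), not a sensor on the valve itself.
    return "OPEN" if valve.solenoid_powered else "CLOSED"

# Accident scenario: power to the solenoid is cut (commanding "close"),
# but the valve sticks open; the panel still reads CLOSED.
valve = ReliefValve(solenoid_powered=False, actually_open=True)
print(panel_indicator(valve))  # prints "CLOSED" while coolant escapes
```

A sensor on the valve position itself, rather than on the command signal, would have made the true state visible: knowledge in the world, not in the operators' heads.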
Blaming 'Human Error' Instead of Design
- Norman (2002): Control panels of many power plants looked deliberately designed to cause errors.
- The control room and computer interfaces at Three Mile Island could not have been more confusing if they had tried.
- Similar design-related accidents occurred on the Belgian railways at Pécrot (2001) and Buizingen (2010).
Knowledge in the World
- The object/system must display/show critical aspects needed for proper use.
- Knowledge must be present both in our minds and in the world.
- Basic Principle: Knowledge in the world.
- Design Reflection: The design must reflect the essence of:
- The object’s intended use.
- How it works.
- The actions that can be taken.
- Feedback: show what the object/device is doing at a given moment.
- Approach design as communication with the user.
- The designer must have extensive knowledge about the future user. Who am I communicating with?
Design Principles
- Problem: The buyer of an object or system is often ≠ user.
- Factors in Purchasing Department:
- Price
- Personal relationships with supplier
- Reliability of the product
- Seldom: USABILITY
- Even when buying for yourself… you as a buyer ≠ you as a user.
- Within this company context, designers still have strong ‘tools’ at their disposal:
- Mapping
- Conceptual models
- Feedback
- Constraints
- Affordances
Mapping
- Mapping: the relationship between controls and their effects in the world (“three spaces”).
- Slide examples: a row of radio buttons (Radio button 1 / 2 / 3), where the layout of the buttons should correspond to the options they select, and room/corridor signage that leaves visitors asking “So… which way do I go?”
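Norman’s classic stove-burner example illustrates mapping; the following sketch (names and layout are illustrative assumptions, not from the slides) contrasts an arbitrary mapping with a natural one:

```python
# Arbitrary mapping: a linear row of knobs controlling a 2x2 grid of
# burners - users must memorize or read labels to know which is which.
arbitrary = {
    "knob1": "back-left",
    "knob2": "front-right",
    "knob3": "front-left",
    "knob4": "back-right",
}

# Natural mapping: lay the knobs out in the same 2x2 grid as the
# burners, so the spatial relationship itself explains the control.
natural = {
    ("front", "left"): "front-left",
    ("front", "right"): "front-right",
    ("back", "left"): "back-left",
    ("back", "right"): "back-right",
}

print(natural[("front", "left")])  # prints "front-left"
```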
Conceptual Models
- To understand how something works, we need conceptual models of how those things function.
- Most conceptual models are oversimplified, yet adequate, representations of how objects or systems operate - but not always.
- Example conceptual model: “Microwaves cook by using electromagnetic waves, which are absorbed by the molecules of water, sugar, and fat in food. This makes them vibrate, which creates heat that cooks the food fairly evenly, from the inside out.”
- Other examples: the home thermostat (on/off control) and the car.
- Contrast how the object really works with the user’s conceptual model (which can be wrong or correct).
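The thermostat example can be sketched as simple on/off (bang-bang) control, assuming a basic heater; the function name and temperatures are illustrative:

```python
def heater_on(current_temp: float, setpoint: float) -> bool:
    """How a home thermostat really works: bang-bang (on/off) control.
    Setting it higher does NOT heat the room faster - that is the
    common but wrong "valve" conceptual model."""
    return current_temp < setpoint

print(heater_on(18.0, 21.0))  # True: heater on
print(heater_on(18.0, 30.0))  # True: still just "on", not "more heat"
```

Both calls produce exactly the same heater behavior, which is why cranking the setpoint up to warm a room faster accomplishes nothing.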
Conceptual Models and Communication
- A device/interface should 'explain itself and how it works' (communication).
- Implication:
- If the designer doesn't provide us with a conceptual model,
- users are forced to create one themselves
- and those are prone to errors.
- Three Aspects of Mental Models:
- The design model: the conceptualization the designer has in mind.
- The user’s model: what the user develops to explain the operation of the system. Ideally, the user’s model and the design model are equivalent.
- The system image: the physical appearance, operation, response, manuals, and instructions. It is the only channel through which the designer communicates with the user.
- The designer must ensure that everything about the product is consistent with and exemplifies the operation of the proper conceptual model.
- Most important is a good conceptual model that guides the user when things go wrong.
Feedback
- It’s crucial to show the effect of an action.
- Without feedback, we’re left wondering whether anything actually happened:
- ‘Did I press the button hard enough?’ (e.g., phone)
- ‘Is the machine even working?’ (often leading to risky actions like opening a safety cover)
- Without feedback:
- we often shut down machines in inappropriate ways,
- unnecessarily restart them (e.g. a computer, causing loss of unsaved work),
- or repeat a command, resulting in duplicate outcomes.
- Feedback is essential!
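A minimal sketch of the feedback principle, assuming a hypothetical submit control: every press is acknowledged, so the user never wonders whether the action registered, and repeat presses do not cause duplicate outcomes.

```python
class SubmitButton:
    """Sketch of a control that acknowledges every press (illustrative
    only, not a real UI toolkit API)."""

    def __init__(self) -> None:
        self.submitted = False

    def press(self) -> str:
        if self.submitted:
            # Feedback prevents the duplicate-command problem:
            return "Already submitted; repeat press ignored."
        self.submitted = True
        return "Submitted."

button = SubmitButton()
print(button.press())  # "Submitted."
print(button.press())  # "Already submitted; repeat press ignored."
```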
Constraints
- The best way to make something user-friendly and to minimize errors is by making it impossible to take any other action - in other words, by limiting the available choices.
- e.g., car with automatic transmission: start in P
- e.g., placement of batteries in electronic devices: why no design that only allows one correct way of inserting them?
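The automatic-transmission example can be sketched as a forcing function in code; the names and messages are hypothetical:

```python
from enum import Enum

class Gear(Enum):
    PARK = "P"
    REVERSE = "R"
    NEUTRAL = "N"
    DRIVE = "D"

def start_engine(gear: Gear) -> str:
    # Forcing function: the interlock makes it impossible to start
    # in any gear other than Park - the wrong action cannot occur.
    if gear is not Gear.PARK:
        raise ValueError(f"Cannot start in {gear.value}; shift to P first.")
    return "Engine started."

print(start_engine(Gear.PARK))  # "Engine started."
```

The constraint lives in the mechanism itself rather than in a warning label, which is exactly the point of the principle.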
Constraints and Design Flaws
- Consequences of design without ‘constraints’:
- faulty actions
- a massive presence of pictograms/signs that attempt to convey the correct instructions, but are often incomprehensible or unreadable (too small, or embossed in the same color as the device, etc.)
- Norman (2002): “Rule of thumb: when instructions have to be pasted on something (push here, insert this way, turn off before doing this), it is badly designed.”
Affordances
- Definition (Norman-style): Affordances are the ‘possible actions’ that an object or interface allows a user to perform — whether those actions are obvious or not.
- A door handle ‘affords pulling’, while a flat metal plate on a door ‘affords pushing’.
- In digital interfaces, a slider affords dragging, a button affords pressing/clicking.
- Affordances exist even if they are not visible. To be effective, the affordance needs to be discoverable / perceivable.
- If an affordance cannot be perceived, some means of signaling its presence is required.
Perceived Affordances and Signifiers
- “A good designer makes sure that appropriate actions are perceptible and inappropriate ones invisible.”
- Concept “perceived affordances”
- Definition: Signifiers are visual, auditory, or tactile cues that indicate where actions should take place and how they should be performed.
- Examples:
- Signs, labels, and drawings placed in the world (“push” / “pull” / exit on doors, or arrows or instructions).
- The perceived affordances (a handle or the shape of a switch).
- Sometimes signifiers are misleading.
Visibility and Error Prevention
- Designing for visibility means that just by looking, users can see the possibilities for action. Visibility is often violated to make things "look good."
- Norman’s central point: Good design makes use of clear affordances and signifiers, so users don’t have to guess how something works. Misleading affordances or poor signifiers can result in confusion, frustration, and errors.
- “Make important parts visible”
- Visibility (also known as discoverability) is a good starting point for preventing errors:
- “If an error is possible, someone will make it. The designer must assume that all possible errors will occur and design to minimize the chance of the error in the first place or its effects once it gets made. Errors should be easy to detect, they should have minimal consequences, and, if possible, their effects should be reversible.”
Cognitive Failure
- Capture errors: a familiar routine seemingly “captures” and takes over an unfamiliar activity
- Description errors: when the descriptors or locators of the correct (safe) and incorrect (at-risk) execution are similar
- Loss-of-activation errors: the cue or activator that got the behavior started was lost or forgotten
- These are three types of UNINTENTIONAL failures/errors: “cognitive brain cramps.”
- “Mistakes”: well-intentioned actions based on poor judgment (e.g., misjudging distance in fog).
- These cognitive failures are very different from “calculated risks”: deliberately deciding to take a risk, knowing it is unsafe.
Design Complexity and User Experience
- Why is the GOOD design on the left and the BAD design on the right of the display?
- Keep it simple.
- Problem: decision making under time stress.
- Decision complexity: the speed with which an action can be selected is strongly influenced by the number of possible alternative actions.
- Design for user preferences: “the beaten path.”
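The decision-complexity claim above is commonly formalized as the Hick-Hyman law: choice reaction time grows with the logarithm of the number of alternatives. This minimal sketch uses illustrative constants (the real values of a and b are fitted empirically per task):

```python
import math

def choice_reaction_time(n_alternatives: int,
                         a: float = 0.2, b: float = 0.15) -> float:
    """Hick-Hyman law: RT = a + b * log2(n + 1).
    The constants a and b here are illustrative assumptions (seconds),
    not empirical values."""
    return a + b * math.log2(n_alternatives + 1)

# More alternatives -> slower selection, but only logarithmically:
print(round(choice_reaction_time(1), 2))  # 0.35
print(round(choice_reaction_time(3), 2))  # 0.5
print(round(choice_reaction_time(7), 2))  # 0.65
```

The logarithmic growth is why trimming a menu from eight options to four helps less than one might expect, while going from two options to one helps decisively under time stress.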