metaphor
The structure and function of one thing is used to roughly describe another thing
Mental Chronometry
measuring how long mental processes take by looking at reaction time (RT).
Hermann von Helmholtz
first to measure nerve conduction in a frog’s sciatic nerve
On average, he found conduction speeds of 24.6-38.4 meters per second
F. C. Donders measured the time taken by distinct mental operations, using three task types:
A — Simple RT
one stimulus and one response
no choice, just detect and react
Measures: Basic physiological RT (sensory → motor, no decision needed)
Complexity: Lowest
B — Go / No-Go
Respond ONLY to a designated stimulus; withhold for others
Measures: Adds stimulus IDENTIFICATION time
Complexity: Medium
C — Choice RT
Multiple stimuli, each with its own unique response
Measures: Adds RESPONSE SELECTION time on top of identification
Complexity: Highest
Donders' Subtractive Logic
comparing reaction times from different tasks to estimate how long one mental process takes.
RT of harder task MINUS RT of easier task = time for the extra mental step
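The subtractive arithmetic can be sketched in a few lines; the millisecond values below are made up for illustration, not Donders' actual measurements:

```python
# Donders' subtractive method: estimate the duration of a mental stage
# by subtracting the RT of a simpler task from that of a harder one.
# All RT values are hypothetical, in milliseconds.

simple_rt = 200   # Task A: detect and react (sensory + motor only)
go_nogo_rt = 280  # Task B: A + stimulus identification
choice_rt = 350   # Task C: B + response selection

identification_time = go_nogo_rt - simple_rt  # time to identify the stimulus
selection_time = choice_rt - go_nogo_rt       # time to select the response

print(f"Stimulus identification: {identification_time} ms")
print(f"Response selection: {selection_time} ms")
```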
Limitations of Subtractive Logic
Assumes pure insertion (stages don't overlap or interact)
Real stages may run in parallel — violating this assumption
Tasks may differ in more than one stage
The Psychological Refractory Period (PRP) Effect
When two stimuli appear in rapid succession, the response to the SECOND stimulus is slower
This delay is called the Psychological Refractory Period (Welford, 1952)
The effect is stronger when the gap between stimuli (the stimulus onset asynchrony, SOA) is shorter
Information Theory
Claude Shannon (1916-2001)
A mathematician, not a psychologist
Developed tools for telecommunications, including applying Boolean algebra to digital circuit design (1937)
the more uncertain you are before something happens, the more information you gain when you find out what happened
Entropy (H)
Measure of uncertainty or chaos in a set of messages/system
Bits
Unit of information; number of binary yes/no questions needed to identify a message
Channel Capacity
the maximum amount of information a person/system can process accurately at one time.
Low H (Low Entropy)
Sequence is highly predictable
Little "surprise" in messages
Example: A book that repeats the letter A on every page → you always know what comes next → H ≈ 0
High H (High Entropy)
Sequence is unpredictable
Messages carry a lot of "information"
Example: A book with completely random letter sequences → maximum uncertainty → H is highest
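A minimal sketch of the entropy calculation behind these two cases (H = -Σ p·log₂ p); the probability values are illustrative:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Low H: a "book" that is almost always the letter A
low_h = entropy([0.999, 0.001])   # close to 0 bits: nearly no surprise

# High H: 26 equally likely letters (maximum uncertainty)
high_h = entropy([1 / 26] * 26)   # log2(26) ~ 4.7 bits

print(f"Low-entropy source:  H = {low_h:.3f} bits")
print(f"High-entropy source: H = {high_h:.3f} bits")
```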
Understanding Bits of Information
Each time you double the alternatives, bits increase by just 1 (logarithmic growth)
This means going from 4 → 8 options adds only 1 bit, not double the information
Cognitive psychology adopted bits to measure 'how much stimulus information' people process
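A quick check of the doubling rule, assuming equally likely alternatives so that bits = log₂(N):

```python
import math

# Bits needed to single out one item among n equally likely alternatives.
# Each doubling of n adds exactly one bit (logarithmic growth).
def bits(n):
    return math.log2(n)

for n in (2, 4, 8, 16):
    print(f"{n} alternatives -> {bits(n):.0f} bits")
```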
Why Does More Choice = Slower Response?
When items are presented randomly and equally often, # of alternatives and bits are perfectly confounded!
You cannot tell whether RT increases because of:
• More alternatives, OR
• More information (bits)
The Set-Size Effect
In choice RT tasks, mean RT rises as the number of stimuli alternatives increases
Number of Alternatives
More items = more possibilities to check through. Response slows because there are simply more things to discriminate
Amount of Information
People are actually sensitive to PREDICTABILITY. If items aren't equally likely, unpredictable items slow you down
The Hick-Hyman Law
Predictable stimuli → faster responses
Unpredictable stimuli → slower responses
Reaction time increases as the amount of information or uncertainty increases; people respond to predictability, not just stimulus count
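For equally likely alternatives, the law is commonly written RT = a + b·log₂(N), i.e. RT grows linearly with bits of information; a sketch with illustrative (not fitted) intercept and slope values:

```python
import math

def hick_hyman_rt(n_alternatives, a=200.0, b=150.0):
    """Predicted mean RT (ms) for n equally likely alternatives.
    a = base RT (intercept); b = ms added per bit of information.
    Both parameter values are illustrative, not fitted data."""
    h = math.log2(n_alternatives)  # bits of stimulus information
    return a + b * h

# Each doubling of the alternatives adds the same fixed amount of RT.
for n in (1, 2, 4, 8):
    print(f"{n} alternatives -> {hick_hyman_rt(n):.0f} ms")
```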
Why the Hick-Hyman Law Mattered
Challenged behaviorism: responses don't just depend on the immediate stimulus — past context matters
People are sensitive to the relative predictability of ALL alternatives, not just what's in front of them
Suggested cognition involves processing probability and prediction — a deeply cognitive, not reflexive, process