cognitive misers
People conserve cognitive energy by adopting strategies that simplify complex problems
dual process theory
System 1: Automatic processing; fast, intuitive, emotional
System 2: Controlled processing; slow, deliberate, logical
system 1 vs system 2
• Rational system (System 2) is better suited for: unfamiliar tasks, tasks with a clear right answer, solving unexpected problems, goal pursuit
• Intuitive system (System 1) is better suited for:
Complex decisions: Dijksterhuis (2004) decision studies
Creativity: mind-wandering promotes creative insight
availability heuristic
When we judge the frequency or probability of some event by how readily pertinent instances come to mind.
Can lead to biased assessments of risk
• Bad news bias: Overestimation of the frequency of dramatic events
ex: risks people overestimate: accidents overall, motor vehicle accidents, tornadoes, floods, cancer, fire, homicide
representativeness heuristic
When we try to categorize something by judging how similar it is to our conception of the typical member of the category
Ex: Who is the prototypical Asian?
conjunction fallacy
logical error that occurs when people assume that specific conditions are more probable than a single general one.
ex:
Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy and was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.
Participants were then asked which of the following is more probable:
Linda is a bank teller.
Linda is a bank teller and is active in the feminist movement.
Most people chose option 2, even though two conditions being true at once can never be more probable than one of them alone
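One way to see the logic, as a quick worked rule: for any two events $A$ and $B$,

$$P(A \cap B) = P(A)\,P(B \mid A) \le P(A)$$

so "bank teller and feminist" can never be more probable than "bank teller" alone.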
base rate neglect
cognitive bias where people ignore or undervalue general statistical information (the base rate) in favor of specific, anecdotal, or vivid information.
ex:
A rare disease affects 1 in 1,000 people. A test detects it 99% of the time, but also gives false positives 5% of the time.
Someone tests positive. What's the chance they really have the disease?
Most people say “around 99%,” but that ignores the base rate — the disease is very rare. The real chance is closer to 2%.
This mistake comes from focusing on the test result and ignoring how rare the disease is.
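The ≈2% figure follows from Bayes' rule applied to the numbers in the example (prevalence 1/1,000, detection rate 99%, false-positive rate 5%):

$$P(\text{disease} \mid \text{positive}) = \frac{0.99 \times 0.001}{0.99 \times 0.001 + 0.05 \times 0.999} = \frac{0.00099}{0.05094} \approx 0.019$$

so only about 2 in 100 positive tests come from someone who actually has the disease.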
Illusory correlations
Thinking that two variables are correlated when they are not, often because the availability and representativeness heuristics work together
halo effect
We tend to generalize our broad impression of a person to judgments of their specific qualities
primacy effect (order effects)
information presented first has the most influence
• Occurs when info is ambiguous
Study: Asch (1946) – trait lists presented in different orders
recency effect (order effects)
information presented last has the most influence
Occurs when the last item comes more readily to mind
framing effect
The way info is presented can “frame” the way it’s processed and understood
• Primacy and recency effect = type of framing effect
Spin-framing: Varies the content of what is presented
E.g., “illegal aliens” vs “undocumented workers”
E.g., “torture” vs “enhanced interrogation”
E.g., “war department” vs “defense department”
Positive/negative framing
cognitive bias where the way information is presented—as a gain (positive) or a loss (negative)—influences decision-making, even when the facts are the same.
ex: Imagine a treatment for a disease is described in two ways:
Positive frame: "This treatment has a 90% survival rate."
Negative frame: "This treatment has a 10% mortality rate."
pluralistic ignorance
• People mistakenly believe that everyone else holds a different opinion than their own.
• Occurs because of a concern for social consequences
Enforces wrong ideas about a group norm
ex: In a classroom, a professor explains something confusing.
No one asks questions, so each student thinks, "I must be the only one who doesn’t get it."
In reality, most students are confused but stay silent, believing they’re alone in their confusion.
self-fulfilling prophecy
when a belief or expectation about a person or situation causes you to act in ways that make that belief come true.
ex: A teacher believes a student is gifted, so they give that student more attention and harder tasks.
The student rises to the challenge and performs better—confirming the teacher’s belief.
second-hand information
refers to knowledge or accounts that are not directly experienced but are learned from someone else, like through gossip, media, or retelling.
Prone to two distortions, defined below:
Ideological distortions
Bad news bias
ideological distortions
This happens when people present information in a way that supports their own beliefs or agenda, often unintentionally. The facts may be altered, exaggerated, or selectively emphasized to fit a certain ideology.
ex: A political commentator might only highlight facts that support their party’s views and leave out opposing evidence—even if the full picture is more balanced.
bad news bias
This is the tendency for media (or people) to focus more on negative stories—like crime, disasters, or conflict—because they attract more attention, even if such events are rare.
ex: Watching the news might make someone believe the world is more dangerous than it actually is, simply because bad news is more frequently reported.
Confirmation bias
In the social realm
• We often ask questions that will provide support for what we want to know
• Engage in a biased search for evidence
Information that supports what we want to be true is easily accepted; info that contradicts what we would like to believe is often discounted
bottom-up processing
“data-driven” mental processing
An individual forms conclusions based on the stimuli encountered in the environment
top-down processing
“theory-driven” mental processing, where an individual filters and interprets new info in light of preexisting knowledge and expectations.
assimilation
Interpreting new information in terms of existing beliefs. Expectations influence information processing. We see what we expect to see.
• ex: Hastorf & Cantril (1954) – Princeton-Dartmouth football game
Fans from Princeton and Dartmouth both watched the same game, but they interpreted events differently.
Each group assimilated the rough plays and penalties into their existing belief:
“Our team is fair; the other team is dirty.”
Instead of adjusting their beliefs based on the game footage (which would be accommodation), they fit the new information into their existing framework—that their team was in the right.
belief perseverance
Persistence of one’s initial conceptions, even in the face of opposing evidence.
• Anderson et al. (1980) – Firefighter study
Participants read that either risk-taking or cautious firefighters performed better.
Later, they were told the info was made up—but they still believed what they first read.
This shows belief perseverance: people stick to initial beliefs even after learning they’re false.
false consensus
We tend to overestimate how much other people agree with us, especially when it comes to undesirable or questionable behaviors.
Undesirable? Consensus.
ex: "Everyone cheats a little on tests—it’s normal."
This is often used to justify bad behavior by assuming it's common.
false distinctiveness
We tend to underestimate how common our positive traits or behaviors are, believing we’re more unique than we really are.
Desirable? Distinctiveness.
ex: "I’m one of the few people who really cares about the environment."
This helps boost self-esteem by making us feel special.
egocentric bias
Tendency to focus on ourselves.
• Better memory for personally-relevant information
• Spotlight effect: overestimating how much others notice and attend to us
bias blind spot
Tendency to believe that we are more objective and less biased than most others.
We have context for our own biases/errors
Pronin et al. (2002): 85% said they were less biased than the average American
Only 1 in 600 said they were more biased
liking gap
After conversations, people underestimate how much their conversation partner likes them.
Boothby et al. (2018): Thoughts about own conversational performance were more negative than partner’s thoughts
thought gap
After conversations, people underestimate how much their conversation partner thinks about them (relative to the reverse).
Cooney et al. (2021): We have more access to our own thoughts, so they loom larger
unrealistic optimism
People tend to believe that good things are more likely to happen to them than to others, and that bad things are less likely.
ex: “I won’t get into a car accident” or “I’ll definitely land my dream job,” even when the odds say otherwise.
Exception: bracing for the worst. As the “moment of truth” approaches (like a test result or job interview response), people often become more pessimistic.
affective forecasting
How we predict we will feel in the future
We are good at predicting if a future event will make us feel positive or negative, but not as good at predicting the strength or duration of those feelings
impact bias
we overestimate the impact (strength) of positive or negative feelings
You think failing a test will ruin your life and you'll feel terrible for weeks.
→ In reality, you're upset for a day or two and then move on.
You believe getting a new car will make you happy for months.
→ But the excitement fades faster than expected.
durability bias
we overestimate how long we will feel those feelings
You think breaking up will leave you heartbroken for months—but you start feeling better in a couple of weeks.
You believe winning an award will make you happy for a long time—but the thrill fades quickly.
planning fallacy
We underestimate how long things will take us to do
Can make us more ambitious, but can also lead to all-nighters
causal attributions
Explanations people use for what caused a particular event or behavior.
Ex: Professor to student: “That’s a good point!”
Students: Was it really though? Did she just want to encourage participation?
locus of causality (attribution)
refers to the perceived source or cause of an event or behavior, and it can be either internal or external.
Internal locus of causality (dispositional): Believing that personal factors (like ability, effort, or decisions) are the cause of success or failure.
External locus of causality (situational): Believing that outside factors (like luck, others, or situational factors) are responsible for outcomes.
covariation principle (attributions)
the idea that behavior should be attributed to potential causes that occur along with the observed behavior
• People determine locus of causality in terms of things that are present when the event occurs but absent when it does not.
Consistency across situations and time, but not across people = dispositional
Consistency across people, but not across situations and time = situational
Three types of covariation information (see the sketch below):
Consensus: do other people behave the same way toward this stimulus?
Distinctiveness: does the person behave differently toward other stimuli?
Consistency: does the person behave this way toward this stimulus across time and situations?
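As a minimal sketch of the rule above (not from the course materials; the function name and the yes/no inputs are simplifying assumptions, since real covariation judgments are graded rather than all-or-none), the covariation logic can be written as a small decision rule:

```python
# Illustrative sketch of the covariation principle as a decision rule.
# Assumption: each dimension is treated as a simple yes/no judgment.

def attribute(consensus: bool, distinctiveness: bool, consistency: bool) -> str:
    """Map covariation information to a predicted locus of causality.

    consensus:        do other people behave the same way toward this stimulus?
    distinctiveness:  does this person behave differently toward other stimuli?
    consistency:      does this person behave this way across time and situations?
    """
    if not consistency:
        return "circumstance (a one-off situation)"
    if consensus and distinctiveness:
        return "situational (the stimulus caused the behavior)"
    if not consensus and not distinctiveness:
        return "dispositional (the person caused the behavior)"
    return "mixed (person and situation both contribute)"

# Only Maria laughs at this comedian (low consensus), she laughs at every
# comedian (low distinctiveness), and she laughs every time (high consistency):
print(attribute(consensus=False, distinctiveness=False, consistency=True))
# -> dispositional (the person caused the behavior)
```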
stability of causality (attributions)
Is the cause of the event or behavior likely to repeat itself in a similar situation?
Entity theorists: People who tend to see personal characteristics (e.g., intelligence, personality) as stable.
Incremental theorists: People who tend to see personal characteristics as unstable and changeable.
Optimistic explanations
Explanatory style: a person’s habitual way of explaining events. Three dimensions:
1. Internal/external
2. Stable/unstable
3. Global/specific
counterfactuals
Counterfactual thoughts: Imagining scenarios that differ from what actually happened.
Upward counterfactuals: “If only” (better alternative).
Arise after bad events; make us feel worse, motivate change.
Downward counterfactuals: “At least” (worse alternative).
Arise after good events; make us feel better, motivate repeated behavior
attribution errors
Correspondent inference: The tendency to make dispositional attributions for others’ behavior.
Also called the fundamental attribution error
Ross et al. (1977): Quiz show study – randomly assigned “questioners” were judged more knowledgeable than “contestants,” even though the roles explained the difference
Cultural differences:
Collectivistic cultures = more situational attributions
Lower socioeconomic status = more situational attributions