Cognitive Biases and Heuristics: Confirmation Bias, Illusory Correlation, Availability & Representativeness

Confirmation Bias

  • Definition: preference for or search for information that confirms existing beliefs or initial attitudes; tends to neglect or undervalue information that could disconfirm those beliefs.
  • Real-world impact: in political topics, people may seek information that supports their views, leading to polarization and reduced openness to objective information that might contradict their beliefs.
  • Automatic vs motivational factors: often automatic processing, but motives can also drive bias (e.g., wanting to feel better about one's own opinions).
  • Relevance to the course: connects to motivational principles and the broader study of biases in judgment and decision making.
  • Bonus paper tip: writing about confirmation bias motives could be a productive topic.

Demonstration of the rule-guessing activity (illustrating seeking information)

  • Setup: the instructor shares a three-number sequence and a hidden rule; students propose triples to test whether they follow the rule, and the instructor answers yes or no.
  • Purpose: to show that seeking information that confirms your rule is not as informative as seeking information that disconfirms it.
  • Key dynamic observed:
    • Most participants propose numbers they think will fit their own rule (confirmation bias).
    • The instructor emphasizes that proposing numbers that do NOT fit their rule (disconfirming evidence) is more informative for narrowing down the rule.
    • The exercise ends with better deduction when the group accepts the goal of falsification rather than confirmation.
  • Link to the Black Swan analogy: the demonstration echoes Nassim Taleb's idea that theories should be challenged by evidence that could falsify them, not just supported by favorable examples.
  • Practical takeaway: in science and critical thinking, aim to disprove your own hypotheses to approach truth rather than seeking only corroborating instances.
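The logic of the demonstration can be sketched in code. Below is a minimal Python simulation; the hidden rule ("any strictly increasing triple") and the guessed rule ("each number is 2 more than the last") are illustrative assumptions, not necessarily the exact rules used in class:

```python
# Sketch of the rule-guessing demonstration (in the style of Wason's
# 2-4-6 task). Both rules here are hypothetical stand-ins.

def hidden_rule(triple):
    """Instructor's secret rule: any strictly increasing sequence."""
    a, b, c = triple
    return a < b < c

def guessed_rule(triple):
    """A student's hypothesis: each number is the previous plus 2."""
    a, b, c = triple
    return b == a + 2 and c == b + 2

# Confirming tests: triples chosen to FIT the guessed rule.
confirming = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
# Disconfirming tests: triples chosen to VIOLATE the guessed rule.
disconfirming = [(1, 2, 3), (5, 10, 20), (3, 2, 1)]

for triple in confirming:
    # Both rules answer "yes", so these tests cannot tell them apart.
    print(triple, "guess:", guessed_rule(triple), "hidden:", hidden_rule(triple))

for triple in disconfirming:
    # The hidden rule still answers "yes" for (1, 2, 3) and (5, 10, 20),
    # revealing that the guessed rule is too narrow.
    print(triple, "guess:", guessed_rule(triple), "hidden:", hidden_rule(triple))
```

Every confirming triple satisfies both rules, so it yields no new information; only the triples that violate the guessed rule expose the difference between the hypothesis and the true rule.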

The Black Swan concept (Nassim Taleb)

  • Core metaphor: the unknown and the unexpected; historical certainty (e.g., all swans white) was overturned by the discovery of black swans after exploration.
  • Lesson for thinking: people tend to form rules and then seek examples that confirm them; disconfirming evidence is what truly tests theories.
  • Applied message: in all aspects of life, when you think something is true, actively seek evidence that could falsify it to approach a more accurate understanding.
  • Summary connection: the video demonstrates how easy it is to fall into confirmatory patterns and why falsification is central to the scientific method.

Heuristics: quick mental shortcuts in judgment and decision making

  • Definition: mental shortcuts that aid quick judgments when cognitive resources are limited; not always accurate, but efficient.
  • Relationship to biases: heuristics and biases are related concepts; debates exist about how to technically distinguish them, but both describe systematic errors or tendencies in thinking.
  • Why they matter in social psychology: they influence everyday judgments and can explain many social phenomena, including political reasoning.
  • Trade-off: speed and efficiency vs. accuracy; errors occur when heuristics are misapplied or over-relied upon.

Illusory correlation

  • Definition: perceiving a relationship between two variables where no meaningful relationship exists.
  • Classic example (repeated in lectures): vaccines and autism – perceived link despite lack of evidence.
  • Other examples discussed:
    • The intuition that denser cities are more dangerous: raw counts of crimes rise with population, creating a perceived link even when per-capita rates tell a different story.
    • Gambler’s fallacy: belief that past outcomes influence future independent outcomes (e.g., coin flips).
  • Causation vs correlation: illusory correlations can sometimes be misconstrued as causal relationships; careful analysis is needed to distinguish causation from mere correlation.
  • Historical example: medieval plague explanations. People formed illusory connections (e.g., blaming witchcraft) rather than seeking ecological explanations (fleas carried by rats). The story of a woman with many cats who remained unaffected illustrates misattribution and the human propensity to find explanations that fit existing ideas.
  • Takeaway: illusory correlations can distort judgment and lead to erroneous beliefs; awareness helps guard against them.
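The gambler's fallacy point can be checked directly: with independent fair flips, the chance of heads is unchanged by a preceding run of tails. A small Python simulation (the seed and sample size are arbitrary choices):

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Simulate a long sequence of fair coin flips and measure the
# probability of heads immediately after a run of three tails.
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

after_run = [flips[i] for i in range(3, len(flips))
             if not flips[i - 3] and not flips[i - 2] and not flips[i - 1]]

p = sum(after_run) / len(after_run)
print(f"P(heads | three tails in a row) ≈ {p:.3f}")  # close to 0.5
```

The estimate stays near 0.5: past outcomes carry no information about the next independent flip, which is exactly what the gambler's fallacy denies.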

Availability heuristic

  • Definition: estimating the likelihood or frequency of events based on how easily examples come to mind, not on objective statistics.
  • Classic demonstration (letter example): asked whether more English words start with the letter K or have K as their third letter, people judge "starts with K" to be more common because such words are far easier to retrieve, even though K actually appears more often in the third position. This illustrates how ease of recall biases judgments about frequency.
  • Media and memory effects: things that are highly memorable or frequently discussed in media are recalled more easily and thus overestimated in frequency or importance.
  • Everyday implication: judgments about how common or probable something is can be skewed by how available examples are in memory, not by real-world frequencies.

Availability vs. statistics of events (study demonstrations)

  • A common classroom task involves ranking the leading causes of death. In studies from the 1970s–1980s, people tended to overestimate causes such as car accidents and homicide, which are frequently depicted in the media, while underestimating heart disease, the statistically leading cause of death, because of its lower media salience.
  • Core point: accessibility and vividness of examples drive perceived frequency; actual statistics may tell a different story.
  • Practical implication: be cautious about media-driven impressions of risk and prevalence; seek out base-rate information.

The representativeness heuristic

  • Definition: judging the probability of an event or category by how similar it is to a typical case (prototype) rather than using base rate information.
  • Example discussed: librarian vs. salesperson. Based on surface traits, one might guess librarian, but the base-rate reality (more salespeople than librarians) should inform judgment. The task highlighted how superficial similarity can mislead category judgments when base rates are ignored.
  • Notes on scope: while the class doesn’t require deep theory on prototype models, this heuristic demonstrates how people rely on stereotypes or prototypes to categorize people quickly.
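The base-rate logic behind the librarian/salesperson example can be made explicit with Bayes' rule. All of the numbers below are hypothetical, chosen only to show the structure (the lecture did not specify them):

```python
# Illustrative base-rate calculation for the librarian vs. salesperson task.
# Every probability here is an assumption for demonstration purposes.

p_librarian = 0.02           # base rate: librarians in the relevant population
p_salesperson = 0.98         # base rate: salespeople
p_traits_given_lib = 0.90    # the "quiet, bookish" description fits most librarians
p_traits_given_sales = 0.10  # ...and only a minority of salespeople

# Posterior probability of "librarian" given the stereotypical traits:
numerator = p_traits_given_lib * p_librarian
denominator = numerator + p_traits_given_sales * p_salesperson
posterior = numerator / denominator
print(f"P(librarian | traits) = {posterior:.2f}")
```

Even when the description fits the librarian stereotype strongly, the low base rate keeps the posterior well below 50%, which is why judging by similarity alone misleads.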

Video excerpt: broader view on heuristics

  • Content notes: a Crash Course-style video (or similar) presented as supplemental material to reinforce concepts.

  • Core messages from the video:

    • Everyday decisions rely on heuristics (e.g., what to wear, lunch menu choices) for efficiency; not about perfection, but practicality.
    • Availability can lead to bias in global judgments (e.g., perceived rise in violence) even when data shows long-term trends.
    • Rare-disease example and Bayes-like reasoning: even with a positive test result, a low base-rate disease requires careful probability evaluation; a common pitfall is to overestimate the post-test probability of disease when base rates are low.
    • The general call for humility: recognize that intuitive judgments are fallible; listen to opposing views; treat intuitions as starting points, not conclusions.
  • Key formula (Bayesian intuition) discussed in the video: if prevalence is low and a test has a non-negligible false-positive rate, the probability of actually having the disease given a positive result can be very small. The example uses prevalence $P(D)=0.001$, false-positive rate $P(+|\neg D)=0.10$, and a sensitivity assumed to be perfect ($P(+|D)=1$). The posterior probability is:

    $$P(D|+) = \frac{P(+|D)\,P(D)}{P(+|D)\,P(D) + P(+|\neg D)\,P(\neg D)}$$

    Substituting values (with $P(+|D)=1$ and $P(\neg D)=1-P(D)=0.999$):

    $$P(D|+) = \frac{1 \times 0.001}{1 \times 0.001 + 0.10 \times 0.999} = \frac{0.001}{0.1009} \approx 0.0099$$

    That is, even after a positive test, the probability of actually having the disease is only about 1%.
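The same numbers can be checked with a few lines of Python:

```python
# Bayes' rule for the rare-disease example from the video.
p_disease = 0.001         # prevalence P(D)
p_pos_given_d = 1.0       # sensitivity P(+|D), assumed perfect in the example
p_pos_given_not_d = 0.10  # false-positive rate P(+|not D)

p_not_disease = 1 - p_disease
posterior = (p_pos_given_d * p_disease) / (
    p_pos_given_d * p_disease + p_pos_given_not_d * p_not_disease
)
print(f"P(D|+) = {posterior:.4f}")  # about 0.0099, i.e. roughly 1%
```

The intuitive answer ("the test is 90% accurate, so a positive result means I'm probably sick") overestimates the risk by an order of magnitude because it ignores the low base rate.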