
Decision-Making Biases & Remedies Study Notes

"How Did I Fall For That?" – General Foundations

  • Human cognition is as vulnerable to mental illusions as human vision is to optical illusions.
  • No person – regardless of intelligence, education, or experience – is naturally immune.
  • Cluster of remedies:
    • AWARENESS → know the biases exist.
    • TRAINING → practice identifying them in real and simulated decisions.
    • MINDFULNESS → pause, reflect, apply critical thought.
  • Biases rarely work in isolation; e.g., overconfidence often pairs with hindsight bias (“I knew it all along”).
  • Meta-rule: ASSUME NOTHING…QUESTION EVERYTHING!

Overconfidence

  • Definition: Systematic tendency to overestimate the accuracy of one’s knowledge, judgments, and forecasts.
  • Heightened when we leave our comfort zone or domain of expertise.
  • Danger sign: phrases like “I’m 99% sure.”
  • WHY it happens
    • Illusion of superiority (better-than-average effect).
    • Illusion of control over random events.
    • Ignorance of the full set of possible outcomes.
    • Selective memory: successes easily recalled, failures fade.
    • Confirmation-seeking search (feeds confirmation bias).
  • Practical consequence: overly narrow confidence intervals, underestimated risk buffers, and surprise when outcomes deviate.
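
A way to make the narrow-interval problem concrete is a calibration check: state 90% intervals, then count how often the truth actually lands inside. The sketch below is a minimal simulation with assumed parameters, not data from these notes:

```python
# Minimal calibration check for overconfidence (all parameters assumed):
# a forecaster's "90% confident" intervals are too narrow, and the hit
# rate reveals it.
import random

random.seed(0)
TRUE_SIGMA = 10.0          # real noise in the estimates
STATED_HALF_WIDTH = 8.0    # forecaster's half-width; a calibrated 90%
                           # half-width would be 1.645 * 10 ≈ 16.4

trials, hits = 100_000, 0
for _ in range(trials):
    truth = random.gauss(0, TRUE_SIGMA)
    estimate = truth + random.gauss(0, TRUE_SIGMA)   # noisy point estimate
    if abs(truth - estimate) <= STATED_HALF_WIDTH:
        hits += 1

# Claimed 90%, delivered roughly 58%: the signature of overconfident intervals.
print(f"claimed 90%, actual hit rate: {hits / trials:.0%}")
```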

Inertia Bias (Procrastination)

  • Disposition to postpone or avoid decisions/action.
  • Mechanisms
    • Conflict-avoidance → “analysis paralysis” (continuously gathering data because choice feels painful).
    • Unpleasantness avoidance → delay when short-term costs loom larger than long-term gains.
  • Antidote: make the solution clear-cut and reduce psychological cost of initiating it.

Immediate Gratification (Present Bias)

  • Mirror image of inertia but rooted in the same self-control weakness.
  • Short-term rewards overweighted; future outcomes discounted to near zero.
  • Manifestations: impulsive purchases, under-saving, unhealthy consumption.
  • Economic parallel: hyperbolic discounting, V=\frac{A}{1+k\,t}, where V is the present value of an amount A received after delay t; a high k signals steep present bias.
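
The formula’s signature behaviour, impatience up close but patience at a distance, drops out of a few lines of arithmetic. A minimal sketch (the amounts, delays, and k value are assumed for illustration):

```python
# Hyperbolic discounting, V = A / (1 + k*t). All figures are
# illustrative assumptions, not values from the notes.

def hyperbolic_value(amount: float, delay: float, k: float) -> float:
    """Present value of `amount` received after `delay` periods."""
    return amount / (1 + k * delay)

k = 1.0  # steep present bias (assumed)

# Choice 1: $100 today vs. $120 in one week.
now = hyperbolic_value(100, 0, k)        # 100.0
later = hyperbolic_value(120, 1, k)      # 60.0 -> take the $100 now

# Choice 2: same $20 gap, but both options pushed 50 weeks out.
far = hyperbolic_value(100, 50, k)       # ~1.96
farther = hyperbolic_value(120, 51, k)   # ~2.31 -> now waiting looks better

# This preference reversal (impatient up close, patient at a distance)
# is exactly what the 1/(1+kt) curve produces and what constant-rate
# exponential discounting cannot.
print(f"today vs. one week:  {now:.2f} vs {later:.2f}")
print(f"50 vs. 51 weeks out: {far:.2f} vs {farther:.2f}")
```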

Anchoring Bias

  • People fixate on an initial value (anchor) and insufficiently adjust for subsequent data.
  • Strongest when no objective benchmarks exist (e.g., pricing exotic derivatives like credit-default swaps pre-2008).
  • Trivial or irrelevant anchors still pull estimates (classic “last 2 digits of SSN” experiment).
  • Guardrail: generate anchors from data, not from arbitrary numbers, and force multiple adjustment rounds.

Selective Perception

  • Under ambiguity, interpretation reflects attitudes, interests, experience, and cultural background more than the stimulus itself.
  • Consequence: two observers can see the same event and walk away with incompatible narratives.
  • Skill: intentionally adopt another perspective (“see through their eyes”).

Confirmation Bias

  • Data collection is subjective—we seek, notice, and recall evidence that supports existing beliefs.
  • Psychological roots
    • Cognitive consistency is pleasurable.
    • Conflicting information creates dissonance (mental discomfort).
    • Reduces complexity: easier to work with confirming evidence than conflicting data.
  • Creates echo chambers and amplifies polarisation.

Framing Bias

  • “Frames” are mental structures that shape meaning; wording or context steers decisions.
  • Losses vs. gains
    • People are risk-averse in the domain of gains, risk-seeking in the domain of losses (Prospect Theory).
  • Three classic types
    1. Risky-choice framing (Asian disease problem)
    • Program A: “Save 200 people.”
    • Program B: \tfrac{1}{3} chance save 600 / \tfrac{2}{3} chance save 0.
    • Equivalent negative frame: C “400 will die” vs. D \tfrac{1}{3} chance none die / \tfrac{2}{3} all 600 die.
    • Majority choose A over B (gain frame) but D over C (loss frame): a pure wording shift, since all four programs carry the same expected value (see the check after this list).
    2. Attribute framing: “75% lean” vs. “25% fat.” Same product, different perception.
    3. Goal framing: Breast-exam messages
    • Positive: “Women who DO exams have an increased chance of detecting tumors early.”
    • Negative: “Women who do NOT do exams have a decreased chance of detecting tumors early.”
  • Managerial implication: craft frames ethically; test multiple frames before deciding.
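
A quick numeric check of the risky-choice example: all four programs carry the same expected outcome, so any preference flip is pure framing. A minimal sketch with the standard 600-person setup (variable names are my own):

```python
# Check that the four Asian-disease programs are numerically
# equivalent: the frames differ, the outcomes do not.

def expected_saved(outcomes):
    """Expected number of people saved, given (probability, saved) pairs."""
    return sum(p * saved for p, saved in outcomes)

TOTAL = 600

# Gain frame
program_a = [(1.0, 200)]                            # "save 200"
program_b = [(1/3, 600), (2/3, 0)]                  # 1/3 save all, 2/3 none

# Loss frame, restated as people saved (600 minus deaths)
program_c = [(1.0, TOTAL - 400)]                    # "400 will die" -> 200 saved
program_d = [(1/3, TOTAL - 0), (2/3, TOTAL - 600)]  # none die / all 600 die

for name, prog in [("A", program_a), ("B", program_b),
                   ("C", program_c), ("D", program_d)]:
    print(name, expected_saved(prog))   # all four print 200.0
```

The typical A-over-B but D-over-C pattern therefore cannot be explained by the numbers; only the wording drives it.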

Availability Bias

  • Decisions lean on information that is vivid, recent, or emotionally charged rather than statistically sound.
  • Example: Fear of flying spikes after a plane crash, even though the risk of death per mile travelled is far higher by car: P(\text{car death}) > P(\text{plane death}).
  • Experiences (news, anecdotes) dominate abstract statistics.
  • Leads to distorted risk assessment and resource misallocation (e.g., over-investing in low-frequency hazards).

Representation Bias (Representativeness Heuristic)

  • Probability judged by similarity to a prototype rather than by base rates or laws of randomness.
  • Patterns inferred where none exist: “I’m on a lucky streak.”
  • Regression to the mean
    • Extreme outcomes tend to move toward average next period.
    • Formal: E[X_{t+1} \mid X_t] = \mu + \rho\,(X_t - \mu), \quad |\rho| < 1.
  • Sample-size neglect: treating “3/5 successes” as equally credible as “1200/2000 successes” (both effects are simulated in the sketch below).
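
Both effects can be simulated directly. The sketch below uses assumed population parameters (mu, rho, sigma) plus the two success counts from the bullet above:

```python
# Regression to the mean plus sample-size neglect, with assumed
# parameters matching the formula above: E[X_{t+1}|X_t] = mu + rho*(X_t - mu).
import math
import random

random.seed(42)
mu, rho, sigma = 100.0, 0.5, 15.0   # assumed population parameters

# Condition on an extreme first observation; the follow-up drifts back.
pairs = []
while len(pairs) < 10_000:
    x_t = random.gauss(mu, sigma)
    if x_t > mu + 2 * sigma:                        # an "extreme" first period
        noise = random.gauss(0, sigma * math.sqrt(1 - rho ** 2))
        pairs.append((x_t, mu + rho * (x_t - mu) + noise))

avg_t = sum(x for x, _ in pairs) / len(pairs)
avg_next = sum(y for _, y in pairs) / len(pairs)
print(f"extreme period ≈ {avg_t:.1f}, next period ≈ {avg_next:.1f}")
# prints roughly 135.5 then 117.8: halfway back toward mu, as rho = 0.5 implies

# Sample-size neglect: identical 60% rates, very different precision
# (rough normal approximation to the binomial standard error).
for k, n in [(3, 5), (1200, 2000)]:
    p = k / n
    half_width = 1.96 * math.sqrt(p * (1 - p) / n)
    print(f"{k}/{n}: {p:.0%} ± {half_width:.0%}")   # ±43% vs. ±2%
```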

Coping with Randomness

  • Humans over-interpret chance events; we crave causal stories.
  • Practical guidelines
    • Accept that coincidences happen without deeper meaning.
    • Resist attributing random events to “fate.”
    • Avoid forming superstitions from illusory patterns (the coin-flip sketch below shows how readily streaks arise in pure noise).
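
One concrete way to see this: even a fair coin produces impressive-looking streaks. A minimal sketch (run lengths and trial counts are illustrative choices):

```python
# Streaks arise naturally in pure noise: flip a fair coin 100 times
# and record the longest run of identical outcomes.
import random

random.seed(7)

def longest_run(flips: str) -> int:
    """Length of the longest run of identical characters."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

trials = 10_000
runs = [longest_run("".join(random.choice("HT") for _ in range(100)))
        for _ in range(trials)]

# In 100 fair flips, the longest streak is typically 6-7; a "hot hand"
# of that length needs no explanation beyond chance.
print(f"average longest streak in 100 flips: {sum(runs) / trials:.1f}")
```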

Sunk Cost Fallacy

  • Treating past, irrevocable costs as reasons to continue an endeavour.
  • Rational rule: only future costs and benefits matter; today’s decisions cannot rewrite the past (see the sketch after this list).
  • Psychological drivers
    • Desire to appear consistent (“saving face”).
    • Aversion to waste.
  • Tip: reframe decision as “fresh project starting today.”
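
The “fresh project” reframe in code, with hypothetical cash-flow figures: sunk spending appears nowhere in the comparison.

```python
# The rational rule in code: compare *future* cash flows only.
# All figures are hypothetical, purely for illustration.

spent_so_far = 800_000          # sunk: identical under both choices, ignored

# Option 1: finish the troubled project.
finish_cost = 300_000
finish_payoff = 250_000

# Option 2: abandon it and redeploy the team.
redeploy_cost = 100_000
redeploy_payoff = 220_000

net_finish = finish_payoff - finish_cost        # -50_000
net_redeploy = redeploy_payoff - redeploy_cost  # +120_000

# The sunk 800k never enters the comparison: it is lost either way.
best = "finish" if net_finish > net_redeploy else "redeploy"
print(f"finish: {net_finish:+,}, redeploy: {net_redeploy:+,} -> {best}")
```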

Limited Search & Bounded Rationality

  • Cognitive limits → we simplify by searching only a slice of the alternative space.
  • May satisfice: accept first option that meets a minimally acceptable threshold.
  • Simon’s Bounded Rationality: full optimisation is infeasible under real cognitive limits; aim for “good enough” (see the sketch after this list).
  • Check: Does the option align with values, goals, and plans?
  • Einstein reminder: “Everything should be made as simple as possible, but not simpler.”
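
A minimal satisficing loop in Simon’s sense (the option scores and threshold are assumed): accept the first option clearing the aspiration level rather than exhaustively optimising.

```python
from typing import Iterable, Optional

def satisfice(options: Iterable[float], threshold: float) -> Optional[float]:
    """Return the first option scoring at or above the aspiration threshold."""
    for score in options:
        if score >= threshold:
            return score        # "good enough" -> stop searching
    return None                 # nothing met the aspiration level

scores = [0.41, 0.58, 0.72, 0.93, 0.67]     # assumed option scores
print(satisfice(scores, threshold=0.70))    # 0.72: good enough, not the 0.93 max
```

Contrast with max(scores): satisficing trades the best possible answer for a much cheaper search, which is the whole point of bounded rationality.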

Emotional Involvement

  • High arousal (stress or excitement) narrows attention and speeds up choices, often impulsively.
  • Negative emotions
    • Constrict cognition, encourage rapid, oversimplified decisions.
  • Positive over-excitement
    • Can lead to overcommitment, ignoring downside risk.
  • Strategy: acknowledge emotion, delay key decisions until emotional intensity subsides or use pre-commitment devices.

Self-Serving Bias

  • Personal outcomes
    • Success → internal attribution (“I’m skilled”).
    • Failure → external attribution (“Bad luck”).
  • Social reversal
    • When judging others’ failures, we downplay situational factors and blame internal flaws.
  • Organizational impact: fosters blame culture and hampers honest post-mortems.

Hindsight Bias

  • After the fact, outcomes appear obvious (“I knew it all along”), masking past uncertainty.
  • Mechanisms
    • Faulty memory reconstruction: we misremember our prior estimates as having favoured the actual outcome more strongly than they did.
  • Consequence: reduces learning—if outcome seemed inevitable, we never probe why it actually happened.

Integrative Checklist – Avoiding Decision Traps

  • Deliberately search for disconfirming evidence.
  • Perform a “bias audit” on yourself: impatience? overconfidence? risk appetite?
  • Prioritise long-term over myopic goals.
  • Experiment with alternative frames—rewrite problem statements.
  • Conduct an empathy exercise: “Walk in someone else’s shoes.”
  • Diversify experiences to expand mental models.
  • Treat extreme performances as temporary; remember regression to the mean.
  • Accept randomness: not every pattern is causal.
  • Push for “outside-the-box” options; widen the choice set beyond the obvious.
  • Manage emotion: build pauses, ask neutral parties, or use decision pre-mortems.