Chapter 1-8 Notes: Biases, Egocentrism, and Belief Perseverance (Vocabulary Flashcards)

Why should psychology be a science?

  • Personal theories vs science:

    • The lecture opens by asking why psychology needs to be a science rather than a collection of personal guesses about people and groups (e.g., what Black people are like, what White people are like, what to do in parenting or sex). These are framed as personal theories, not universally valid explanations.

    • If everyone has different personal theories, some must be wrong, especially when beliefs concern broad groups (e.g., politics, religion, child-rearing).

    • Personal theories usually rest on anecdotal data of questionable reliability: Wikipedia, friends, family, or one’s own life experiences in a specific place (e.g., Miami-Dade).

    • Conclusion: science is needed to systematize data, reduce bias, and provide shared standards for evaluating claims.

  • Biases of belief and the two “nails in the coffin” for personal theories:

    • The speaker teases a broader argument for why science matters, then sets up a focus on how our thinking is biased before getting into methods.

Biases and key concepts: egocentrism, egotism, and self-serving bias

  • Egocentrism: perceptual bias

    • Definition: the tendency to perceive the world from the perspective of the self.

    • Core idea: you can only see the world through your own eyes; you are the center of your universe.

    • Developmental note (Piaget): kids are egocentric; only later (around ages 5-7) do they begin decentering and recognizing that others may see the world differently.

    • Classic demonstration (Piaget’s mountain task):

    • A child sits on one side of a model mountain; a doll is placed on the opposite side. The child often reports what the doll sees as what the child sees, not realizing the doll has a different perspective.

    • Everyday examples:

    • A kid standing in front of the TV, assuming you can still see what they see.

    • Adults who think others share their beliefs and experiences.

    • Real-world trajectory: egocentrism often recedes in early schooling (around 5-7) but becomes more nuanced in adolescence and adulthood.

    • Adulthood examples from lecture:

    • People act as if they are causal agents over events they do not actually control (e.g., attributing the weather or other outside events to their own actions).

    • A neighbor who believes he controls weather by his actions (e.g., shutters up or down) and acts as if his behavior determines external outcomes.

    • Superstitions and “lucky” behaviors (e.g., lucky underwear for games) reflect egocentric thinking that one’s own actions can influence outcomes that are largely random.

    • A fan who suddenly prays for a field goal on Sunday implies they can summon divine intervention over a sports outcome in the moment; the speaker humorously calls this egocentric.

    • Clinical note: egocentrism is a perceptual bias, not a moral failing; many people show egocentric thinking in different domains.

    • Related observational trope: some people act as if “the world revolves around them” and treat others as if they know what they know or see what they see.

  • Egotism: motivational bias (biased interpretation in one’s own favor)

    • Distinction from egocentrism:

    • Egocentrism is about perception (how you see the world).

    • Egotism is about interpretation (how you tell a story about the world to make yourself look good).

    • Key question: when narrating events involving oneself, do you cast yourself as the hero or the goat?

    • Everyday illustrations:

    • In fights with friends or parents, who caused the problem? Each party tends to blame the other; egotistical interpretation aims to present oneself positively.

    • Pop culture and political examples from the lecture emphasize how people reinterpret events to maintain a favorable self-view.

    • Connection to pride and self-image: people often present themselves in the best possible light, especially in social interactions or when recounting failures or conflicts.

  • Self-serving bias (claim and blame): the practical manifestation of egotism

    • Core idea: claim successes, blame away failures.

    • Short form:

    • Claim successes

    • Blame failures on external factors or others

    • Test-score example from lecture:

    • After a tough test, students may react by blaming the test, the book, or the instructor rather than recognizing gaps in their preparation.

    • The instructor uses a dramatic example: after receiving an F, a student might blame everything else rather than their own study or effort.

    • Political and leadership examples:

    • Political figures claim credit for successes but shift blame to others when things go wrong.

    • Real-world dynamic: many people intellectually know they should take responsibility, but emotionally they prefer to preserve a positive self-image by attributing success to themselves and failure to external circumstances.

  • Distinction among the three biases and their interplay

    • Egocentrism (perceptual bias): how you see things from your own point of view.

    • Egotism (motivational bias): how you interpret events to favor yourself.

    • Self-serving bias (a specific form of egotism): how you narrate outcomes to maximize self-worth (claiming success, blaming failure).

    • The lecturer uses political, sports, and everyday life examples to illustrate how these biases show up in real settings.

    • The speaker also notes that people can be both egotistical and egocentric at the same time (e.g., a narcissistic individual).

  • The role of cognitive conservatism and belief perseverance

    • Conservatism (in this context) refers to resistance to change and risk; a belief that one should not alter current beliefs quickly.

    • This connects to belief perseverance, assimilation, and other cognitive biases that keep people attached to their beliefs even in the face of contradictory evidence.

How biases protect beliefs: assimilation and belief perseverance

  • Assimilation (a passive information filter)

    • Definition: taking in new information in a way that is consistent with what you already believe.

    • Mechanism: information passes through a cognitive filter (your baggage of beliefs, attitudes, stereotypes, etc.) which can alter how new data is interpreted.

    • Everyday examples:

    • After two different groups watch the same game, they may report seeing very different things (perception shaped by allegiance to their team).

    • The classic broadcasting example: two different broadcasts of the same game (Marlins vs. Braves) lead to different interpretations of a call because of viewers’ allegiances.

    • Summary phrase: “I wouldn’t have seen it if I didn’t believe it.”

    • Active vs passive: assimilation is often passive; you don’t realize you are filtering information.

    • Broader implication: assimilation can shape how people interpret events in politics, media, and everyday life, leading to persistent misperceptions when faced with conflicting data.

  • Belief perseverance (not always treated under that name in the lecture, but discussed here)

    • Definition: continuing to hold onto a belief even after it has been contradicted by evidence.

    • The lecture notes belief perseverance in the context of political rhetoric (e.g., the 2020 election claims) and other belief systems.

    • Mechanism: once a belief is adopted, people generate and seek evidence that supports it and dismiss disconfirming data, reinforcing their stance.

    • Example snippets from the lecture include claims about the 2020 election and the persistence of those beliefs despite multiple court cases.

    • The concept is linked to assimilation and other protective biases: even when evidence disconfirms a belief, people will often persist due to cognitive filters and motivated reasoning.

  • Causative conservatism (a label used in the lecture)

    • Definition and usage: a form of conservatism that aims to preserve existing beliefs by constructing causal explanations that fit what one already believes.

    • It helps explain why people maintain inaccurate or unsupported beliefs even when faced with counterevidence.

  • Illustrative examples of belief-protecting mechanisms

    • The Mary apparition example in Clearwater, Florida: rational explanations (sprinkler alignment, glass discoloration) coexist with strong religious belief that a miracle occurred; believers may persist despite rational arguments.

    • The “gremlin” example from aviation history: pilots who lose a plane often attribute it to unseen gremlins rather than systemic or mechanical explanations, illustrating an instinct to explain misfortune with an external, controllable cause rather than randomness or error.

    • The “you can’t see them” claim in belief discussions: once someone asserts invisibility (e.g., gremlins or angels), the conversation becomes irreconcilable because proving nonexistence is not feasible.

The role of skepticism, testing, and how psychology views belief

  • The psychologist’s stance on judgment vs explanation

    • The instructor emphasizes that psychology is about explaining behavior, not judging people as good or bad.

    • The goal is understanding why people do what they do, not condemning them for their beliefs.

    • This approach contrasts with some other fields (e.g., criminal justice) that focus more on judgment.

  • The test-as-trick demonstration on social sensitivity

    • The instructor presents a staged “test” about social sensitivity to illustrate how tests can be used to prime beliefs and then reveal those beliefs in interaction.

    • The setup suggests that people will score in ways that confirm their expectations, illustrating self-serving or bias-confirming responses.

    • The takeaway: people can be predisposed to interpret themselves as socially sensitive even when evidence contradicts that self-perception, highlighting perception vs reality in social judgments.

  • The metacognitive takeaway: why it’s hard to change minds

    • People have built-in defenses that protect their beliefs (assimilation, confirmation bias, belief perseverance).

    • Once a belief is established, it’s hard to dislodge because people actively search for confirming evidence and dismiss disconfirming data.

    • The term “causative conservatism” captures this tendency to maintain beliefs with causal explanations that fit prior commitments.

Concrete examples and anecdotes used in the lecture

  • Personal and social anecdotes to illustrate biases:

    • Home and family dynamics: egocentrism starting in childhood with Piaget’s mountain task.

    • Sports fandom and superstition: luck, ritual behavior, and selective interpretation of game outcomes; the belief that personal actions influence largely random events; incongruities such as a fan blaming a venue or a seat for a loss.

    • Everyday miracles and rational vs religious explanations: miraculous appearance stories (Mary, Jesus in clouds, etc.) and the difficulty of disproving nonexistence or non-physical phenomena.

    • Public figures and blame-shifting: claims of success vs blame for failures; the difference between what people claim and what actually happened; the phenomenon of “I did that” vs “it was someone else’s fault.”

  • Statistical and survey references from the day’s anecdotes:

    • Heaven-belief survey (percentages):

    • 66% said Oprah would go to heaven; 52% Princess Diana; 28% Dennis Rodman; 19% O.J. Simpson; the top pick overall was Mother Teresa at 79%.

    • The larger point: many people believe they themselves are going to heaven, often rating their chances above those of these well-known figures, illustrating optimistic self-perception and egotism.

    • A humorous aside about SAT scores recall and misremembering, illustrating belief/perception gaps.

    • A playful nod to contemporary politics and media coverage, showing how belief can be reinforced by selective information and group identity.

Connections to foundational principles and real-world relevance

  • The scientific method vs anecdotal reasoning:

    • Emphasizes the need for controlled data collection, replication, and skepticism about sources of information.

    • Demonstrates how personal theories and anecdotal data can lead to erroneous conclusions if untested or unrepresentative.

  • Cognitive biases and everyday decision-making:

    • Egocentrism, egotism, and self-serving bias shape how we perceive events, interpret others’ actions, and narrate our own role in outcomes.

    • Assimilation and belief perseverance illustrate why people resist changing beliefs even when faced with contradictory evidence.

    • The content shows how these biases affect personal decisions (e.g., sports fandom, politics, and moral judgments) and societal issues (e.g., victim-blaming of rape survivors, accountability for accidents).

Ethical, philosophical, and practical implications

  • Ethical implications:

    • Acknowledging bias is essential for responsible research and for fair interpretation of human behavior.

    • It’s important not to justify harmful beliefs by rationalizing them through biased explanations (e.g., unjust blaming of victims of rape).

    • The psychologist’s stance emphasizes understanding rather than moral judgment, which has practical implications for therapy, education, and public discourse.

  • Practical implications for research and teaching:

    • Encourage students to distinguish between perception (egocentrism) and interpretation (egotism), and to recognize self-serving biases in evaluating evidence.

    • Teach about assimilation and belief perseverance to help students critically evaluate information, especially in the era of rapid information flows and misinformation.

  • Real-world caution:

    • Be aware that people may present themselves in a favorable light and attempt to protect their self-image through selective reporting.

    • When discussing controversial topics, acknowledge the role of cognitive biases and encourage evidence-based analysis instead of moral judgments.

Summary of key terms and distinctions

  • Egocentrism (perceptual bias): tendency to view the world from one’s own perspective; kids are classic examples; decentering occurs around 5-7 years old.

  • Egotism (motivational bias): bias in interpreting events to cast oneself in a favorable light.

  • Self-serving bias: claim successes, blame away failures; a specific form of egotism.

  • Assimilation: passively integrating new information that is consistent with existing beliefs; can filter or distort data.

  • Belief perseverance: maintaining beliefs despite contrary evidence; reinforced by selective evidence gathering.

  • Causative conservatism: framework to describe how people protect and preserve their causal explanations for beliefs.

  • Conservatism (in thinking): resistance to change; preference for the status quo.

Final takeaway

  • Psychology benefits from being a science by providing structured analysis of behavior that helps us understand why people think and act the way they do, rather than simply relying on personal theories or anecdotes. Recognizing egocentrism, egotism, self-serving bias, assimilation, belief perseverance, and related concepts helps us interpret human behavior more accurately and ethically in real-world contexts.