Notes: Critical Thinking, Cognitive Biases, and Challenges to Morality (Lecture on Module 1–2 Transitions)
Quiz logistics and course structure
- Reading quiz 1 details
- Date and time: Monday, September 29 at 09:05 AM
- Format: 15 minutes, 10 multiple-choice questions
- Source: readings drawn from the textbook, not from additional reading folders
- Readings to study: two textbook essays
- Subjectivism by Julia Driver
- Egoism and Moral Skepticism by James Rachels
- Purpose: exercise in critical reading to understand the author’s argument and how it is made
- Where to find readings for the quiz
- Readings are in the textbook (not in the “additional reading” folders)
- Page numbers are not fixed: they differ between editions and between hardbound and digital copies
- How to locate essays: open the textbook's Table of Contents and find the essay title and author (e.g., Subjectivism by Julia Driver, Egoism and Moral Skepticism by James Rachels)
- Course logistics and communication
- If questions arise, email the instructor for guidance
- Announcements and in-class Q&A occur at start of class; after questions, proceed to Blackboard module folders
- Module 1 wrap-up and transition to Module 2
- End of Lecture 4 of Module 1: Fallacies and biases
- Emphasis: cognitive biases are central, not merely optional topics
- Prior worksheet focus: fallacies; biases were not required on the argument analysis worksheet
- Preview of Module 2: challenges to morality, framed by editor Steven M. Cahn in Exploring Ethics: An Introductory Anthology
Critical thinking review: biases, emotion, and reasoning
- Central claim: people often accept beliefs as true without evidence and reject evidence that contradicts their beliefs
- Cultural context: modern culture tends to prioritize feelings over other considerations when judging right/wrong or true/false
- Emotions vs. reason
- Feelings are not inherently bad; both reason and emotions are important in human life
- Tension: quick, emotionally driven judgments can be efficient but risk error when critical thinking is needed
- Evolutionary perspective (brief)
- Emotions can be adaptive for quick decisions (e.g., sensing danger) but may not yield the best long-term rational conclusions
- Definition of cognitive biases (summary)
- Cognitive biases are systematic impediments to critical reasoning caused by filtering information through personal experience and preferences
- This filtering is a coping mechanism to process large information loads quickly, but it can lead to errors and poor decisions
- Consequences for critical reading and argumentation
- Biases can shape how we evaluate arguments, evidence, and other people
- Openness to alternatives is essential for fair evaluation, not naive acceptance of every view
- Open-mindedness involves willingness to consider being wrong and to defend one's position with reasoned evidence
Key cognitive biases discussed
Dunning-Kruger effect
- Core idea: we are ignorant of how ignorant we are; lack of self-awareness leads us to overestimate our own knowledge or ability
- Plain-language takeaway: the more you know, the more you realize how much you don’t know; less knowledge can produce overconfidence
- Implication: recognize areas where you might overestimate competence; practice intellectual humility
- Related concept: confidence may exceed actual competence
- Practical note: acknowledge the limits of one’s knowledge and delay final judgments until more evidence is gathered
Confirmation bias (often operating alongside the Dunning-Kruger effect)
- Tendency to attend to evidence that confirms what we already believe and discount or ignore evidence that contradicts it
- Example: political leadership (election outcomes, media reporting), where supporters and opponents selectively accept positive or negative reports depending on their stance
- Consequence: reinforces existing beliefs and makes fair evaluation harder
Availability heuristic (availability error)
- Tendency to rely on evidence that is memorable or vivid rather than statistically reliable or broadly representative
- Everyday effect: memorable personal experiences shape beliefs more than systematic evidence
- Example vignette: vivid anecdote about a harmful dog breed can lead to biased beliefs about all dogs of that breed
- Caution: memorable events do not guarantee generalizable or accurate conclusions
Motivated reasoning
- Definition: reasoning performed to support a preconceived conclusion rather than to discover the truth
- Relationship to confirmation bias: related but distinct; motivation drives seeking supportive evidence rather than evaluating all evidence impartially
- Practical concern in research and debate: researchers or speakers may selectively gather or interpret data to confirm what they already believe
Disagreements: belief vs. attitude (Stevenson’s framework)
- Disagreements in belief: competing beliefs about how things are; can be resolved by examining relevant facts and evidence
- Disagreements in attitude: opposed desires or evaluations toward an issue (e.g., supporting vs opposing a policy), often harder to resolve because attitudes influence what evidence is acceptable
- Interplay with evidence: beliefs determine which evidence is considered relevant; attitudes influence how evidence is weighed
Critical reading and self-awareness in argument analysis
- Questions to guide reading and analysis
- Is the author presenting a challenge to morality or reacting to a challenge? (Stevenson’s context in the chapter)
- Are there disputes in belief or disputes in attitude at the core of the disagreement?
- Is there evidence of bias (confirmation bias or motivated reasoning) in the presentation or interpretation of data?
- How does the author handle emotion and reason in moral discourse? Is there an emphasis on open-minded scrutiny?
- The open-mindedness stance
- Open-mindedness does not mean accepting all views; it means being willing to consider alternatives and to defend one’s position with reasoned argument
- The goal is to be able to justify positions and to adjust beliefs when credible evidence warrants adjustment
Module 2 preview: Challenges to morality (editorial framing)
- Textbook structure and editors
- Exploring Ethics: An Introductory Anthology is a compilation of essays on ethics
- Steven M. Cahn edits (rather than authors) the collection and arranges the essays under thematic headings
- Concept of an anthology: a compilation, not a single-author textbook
- What is a “challenge to morality”?
- Challenges to morality question the validity, relevance, or necessity of moral principles
- Questions about the source and nature of morality: tradition, God, human constructs, or other foundations
- Debates about whether morality is essential for human flourishing or well-being
- The debate about the relationship between science and morality: whether science can or should determine moral truths
- The “is/ought” boundary (Hume) and the science-morality tension
- Is-ought problem: facts about the natural world do not straightforwardly yield moral conclusions
- David Hume’s claim: moral values arise from sentiment, not from empirical facts
- Opposing view: science as a moral enterprise that aims to relieve suffering and improve life, suggesting a role for values in scientific inquiry
- The tension: values are not physical properties; science investigates physical properties; thus, some argue science is value-neutral and cannot resolve moral disputes
- Stevenson’s corrective position (as the reading for today’s lecture argues)
- Stevenson contends that ethical disagreement often involves factual disputes and can be resolved by following relevant evidence
- He also argues that science can indirectly influence moral attitudes once beliefs are updated by evidence
- The two-part claim:
- Scientific method can resolve disagreements in beliefs (facts about what is the case)
- Science can guide the conversation toward attitudes, but attitudes are harder to change due to entrenched preferences and biases
- Interpersonal vs. personal moral problems in Stevenson’s framework
- Interpersonal disagreements: between two or more people about what is the case; addressed by weighing the relevant evidence and converging on the facts
- Personal disagreements: about what one ought to do in a given situation; more about individual decisions and values
- Practical example: the COVID-19 reopening debate (illustrative, not a value verdict)
- Mary vs. John disagreement in belief about whether to reopen the economy
- Evidence-driven resolution: compare economic impact vs. public health outcomes; weigh the burden of restrictions against the benefits of protection
- Important caution: science is not monolithic; different scientists may interpret data differently; “follow the science” can misrepresent the diversity of scientific opinions
- What Stevenson implies for critical reading and debate
- Use scientific evidence to inform moral reasoning, but recognize that science does not alone settle all moral questions
- Distinguish disputes in belief from disputes in attitude and acknowledge how attitudes shape the reception of evidence
- Acknowledge the complex interplay between facts, values, and emotions in moral discourse
Book structure and upcoming readings
- Next reading assignment for Wednesday
- Tom Regan, How Not to Answer Moral Questions (in the Blackboard “additional reading” folder for the current edition)
- Task: read with an eye toward identifying the specific challenge to morality Regan is addressing and how it fits into Cahn’s section on Challenges to Morality
- Questions to prepare for the next class
- What challenge to morality is Regan’s essay addressing?
- How does Regan argue moral questions should properly be answered, and how does that relate to Stevenson’s framework?
Practical implications for exam preparation
- Focus on core concepts
- Definitions and distinctions: belief, attitude, value
- Types of disagreements: in belief vs in attitude
- Cognitive biases described: Dunning-Kruger effect, confirmation bias, availability error, motivated reasoning
- The is/ought problem and the science-morality debate
- Stevenson’s two-part answer: belief-based resolution via evidence; attitudes influenced by beliefs and evidence, with potential indirect change
- Use concrete examples from the lecture
- The Mary-Joe slap scenarios illustrate how context changes moral assessment based on factual details
- The COVID reopening debate as a real-world case showing how evidence can influence beliefs and how attitudes respond to changing beliefs
- Study strategy reminders
- Read the two specified textbook essays carefully (Driver and Rachels)
- Practice identifying whether a claim is a belief or an attitude and what evidence would be relevant to resolve it
- Be wary of assuming science resolves all moral questions; recognize both its power and its limits
- Administrative reminders
- Quiz times and format: 15 minutes, 10 MC questions, from the textbook readings
- Page numbers may differ across editions; rely on title and author to locate essays
- Roll call is no longer taken in class; email the instructor with questions about upcoming material or assignments