week 4 phi 108 Chapter Three: Informal Logic—Habits of Thinking
Informal logic, as discussed in the notes, focuses on the psychological factors and thinking habits that can either help or hinder effective reasoning in everyday situations.
This chapter examines thinking habits that can either hinder or facilitate finding truth, effective communication, and problem-solving. These are called 'habits' rather than 'rules' because exceptions exist, though they are rare. Bad habits often stem from psychological factors (fears, motivations, attitudes) or problematic beliefs. While not inherently wrong (unlike fallacies), they weaken thinking and make one vulnerable to manipulation.
3.1. Self-Interest
Definition: Acting, at least in part, for one's own benefit. Self-interest on its own is not inherently bad; most decisions are partly based on personal benefit.
Problematic Self-Interest: It becomes a problem when arguments or worldviews are advanced only because of personal benefit, without other reasons.
Historical and Theoretical Context: Self-interest has a place in specialized reasoning areas:
Aristotle: Claimed everyone desires happiness by nature, forming the basis of his ethics.
John Stuart Mill: Made 'utility' (pleasure and the absence of pain) the basis of Utilitarianism.
Adam Smith: Seen as the 'father of modern economics', he centered his work on self-interest as a normal, rational human behavior. He believed that in a properly functioning economy, self-interest would be directed towards the public good.
Game Theory: A branch of mathematics where self-interest plays an important role.
Distinguishing 'Intelligent Self-Interest': Logicians differentiate this from ordinary selfishness and egotism. Intelligent self-interest involves:
Looking for the 'bigger picture'.
Seeing alignment between one's own interests and others'.
Willingness to sacrifice short-term benefits for longer-term ones.
Recognizing that some personal benefits are not worth pursuing.
How Self-Interest Impedes Good Reasoning:
Occurs particularly when people have a strong emotional or economic stake in something under perceived threat.
Leads to passion and emotion, clouding judgment.
If one secretly desires something to be true for personal gain without sufficient evidence, it can lead to inadvertently misinterpreting evidence, discounting contradictory evidence, or inventing weak rationalizations, resulting in faulty understanding and bad decisions.
Self-Interest and Competitive Thinking:
Can lead people to believe they must dominate conversations and win every argument, even without direct emotional or economic stakes.
Especially prevalent in competitive societies, turning discussions into battles, which worsens debate quality.
Behaviors include interrupting, obstructionist questions, nit-picking, insulting, aggressive gestures, and loud voices.
Systematic Critical Reason: Not a weapon; rational inquiry should be collaborative, not competitive.
The goal is to learn and progress; a win for one speaker is a win for everyone.
3.2. Saving Face
Definition: The habit of continuing to argue for one's ideas or beliefs to avoid admitting mistakes or being proven wrong, often due to an interest in having a good reputation and being liked or admired.
Connection to Cognitive Dissonance: Psychologically, this relates to confronting two or more beliefs that cannot simultaneously be true (e.g., 'I am a good person' and 'I caused someone harm').
Psychological Mechanism: Most people are strongly disposed to avoid such contradictions and dislike having muddled thoughts pointed out, as it makes them look foolish. They invent self-interested reasons to reject one belief, with the purpose of restoring self-worth.
Consequences: This is a bad tactic as it may blind individuals to the truth or make it difficult to discover.
Examples:
'Only six people came to the company picnic. I was on the organizing team. But it wasn’t my job to send out the invitations.'
'I got an F on that essay. But I’m getting an A in all my other classes. Clearly, the professor doesn’t know what he’s doing.'
'Jim has been my best friend for ten years and he’s always been nice to me, so I just can’t believe he is the one who stole the old man’s wallet. You must be mistaken.'
'Sally has been my best friend for ten years. But tonight she stole my wallet. I guess she was a bad person all along, and she just tricked me into thinking she was a good person.'
3.3. Peer Pressure
Definition: The psychological pressure exerted by communities or social groups on members to accept prevalent ideas, practices, and beliefs that form part of the group's identity.
Forms of Pressure:
Subtle: An odd look or cold shoulder for non-conformity.
Overt: Exclusion, being shut out of decision-making, malicious gossip, or even threats of violence.
Propaganda: Use of fake social media accounts ('sock puppet' or 'troll' accounts) to create an illusion of widespread support for a message, amplifying its influence (e.g., the coordinated campaign against Star Wars: The Last Jedi).
Consequences: These techniques encourage individuals to keep dissenting views to themselves or change them to fit in with the group.
Critical Point: The number of people who believe an idea has no bearing on its validity or quality.
Problem: Accepting an idea or worldview only because it is favored by one's group, and for no other reason, is a bad habit.
3.4. Stereotyping and Prejudice
Stereotyping:
Definition: The assumption that all members of an 'other' social group share certain qualities or behavioral traits, often based on little to no real evidence. (A fixed idea; it may be positive or negative.)
Formation: Can arise from characterizations in entertainment media or limited encounters with one or two members of a group.
Flawed Basis: The 'sample size' from personal encounters is almost always too small, leading to a hasty generalization fallacy. Stereotypes can even form without any evidence, simply based on intellectual environment.
Impact: Treats people as representatives of a type, preventing recognition of individuals' distinct qualities and hindering the discovery of truth about individuals and groups.
Prejudice:
Definition: A hostile or harmful attitude, judgement, or feeling about the merit or worth of people in a group, assigned based on stereotypical assumptions. (Always negative.)
Group Dynamics: Often arises when a group pressures its members to believe it is superior to others, leading to negative views of rival groups' ideas and worldviews.
Harmful Behaviors: Leads to mistreatment (racist, sexist, classist, ableist, religiously hateful behavior).
Qualities Assigned: Can range in intensity from minor bad qualities (foolishness, uncleanliness) to inciting strong hate/fear, attributing negative traits (emotional instability, criminal tendencies, animalistic features, disease), denying full humanity, or alleging conspiratorial agendas.
Overall Impact: Stereotyping and prejudice consistently prevent people from seeing others and situations as they truly are.
Persistence of Prejudice: Main reason is peer pressure; prejudiced remarks are often encouraged and rewarded within prejudiced groups through smiles, laughter, welcoming gestures, and approving words. This perpetuates prejudice by discouraging independent thought.
3.5. Excessive Skepticism
Definition: The belief that nothing can be truly known unless one can be absolutely certain and beyond any possible doubt. This level of skepticism is almost always excessive, except perhaps for Socrates.
Origins in Risk Assessment: Tends to arise when estimating risks. The excessively skeptical person magnifies potential risks, becoming unwilling to act unless absolute safety and certainty are guaranteed, or refusing to act simply because something 'has never been tried before'.
Example: The Moon landings involved high uncertainty and risk (e.g., initial fears that landers would sink into lunar sand 'seas'). Excessive skeptics weigh risks too heavily, leading to inaction and potentially preventing others from acting.
Theoretical Manifestations: Can appear in purely theoretical matters like doubting external reality (e.g., Descartes' 'evil genius', computer-generated virtual reality, infinite regress of 'How do you know?').
Postmodern Philosophy Connection: Some postmodern philosophies argue that complete knowledge of complex situations (politics, economics, culture) is impossible, or that all truths are interpretive and contextual, making certainty impossible, or even that truths do not exist, only interpretations.
Danger to Truth: Disinformation creators exploit radical skepticism about truth to make repugnant ideologies seem no worse than other worldviews.
Key Principle: Beliefs need only be beyond reasonable doubt, not all possible doubt.
Rule of Thumb: Doubt based on speculation without evidence is not reasonable doubt. Assessing skeptical claims requires considering the probability of alternative explanations, not just their possibility. If an alternative is possible but highly unlikely with little evidence, it's not a strong basis for skepticism.
Example: Explaining a dream of marrying one's worst enemy as a glimpse of a parallel universe is possible in principle, but there is no evidence for it, so it is not a reasonable explanation for the dream.
3.6. Intellectual Laziness
Definition: The habit of 'giving up too soon' or deliberately avoiding significant questions.
Manifestations: Statements like 'Thinking that way is too confusing,' 'Your questions drive me crazy,' or 'These questions cannot be answered; you just have to accept it.' It also includes superficial answers, such as witty quotations from movies or songs, as complete responses to philosophical questions.
Defense of Laziness: Some individuals exert effort to construct complex arguments to defend their intellectual idleness, claiming that intellectually or scientifically minded people 'can’t handle the mystery of things' or seek to 'take away the beauty and the magic of the world.'
Willed Ignorance (a variation of intellectual laziness):
Definition: Actively preventing oneself from answering difficult questions or acknowledging relevant facts.
Mechanism: Preferring to live in a 'bubble' where worldview challenges don't appear. While preserving core values is important, deliberately ignoring facts that challenge a worldview leads to poor decisions.
Beliefs about Truth: Can involve believing some questions are unanswerable or forbidden, or preventing acknowledgment of facts that contradict one's worldview.
Similarity to Cognitive Dissonance: The intellectually lazy person suspects an inconvenient truth exists and takes active steps to avoid confronting it. This is not pure laziness, but has the same effect.
Relativism and 'Truth': Some argue that 'Truth' (with a capital T) about ultimate things (God, justice, knowledge, reality) does not exist, often appealing to relativism. While this thinking may involve effort, its function is often to justify a refusal to think deeply about important matters. The intellectually lazy or willfully ignorant person avoids the work needed to discover ultimate truths.
Consequences: Despite the effort sometimes involved in maintaining it, intellectual laziness can cause significant trouble later, making one susceptible to manipulation and deception, and leading to paralysis in situations requiring decisions.
3.7. Using 'Deepities'
Definition: A deepity is a statement that sounds deep and meaningful, but when you look closer, doesn’t actually say much at all. People often use deepities to seem smart or thoughtful without really thinking deeply. It’s like a shortcut to sounding wise without doing the hard work of real thinking—kind of like intellectual laziness.
Origin: Term coined by American philosopher Daniel Dennett, attributed to his friend's daughter.
Dennett's Definition: A deepity is a proposition that seems important, true, and profound, but achieves this effect by being ambiguous. On one reading it is manifestly false, but would be earth-shaking if it were true; on the other reading it is true but trivial. (In other words: a deepity sounds profound by being unclear or vague. On one level it sounds impressive but is false; on another level it is true but very obvious or boring.)
Examples:
'Children are the future.'
True but trivial reading: Children will grow up and live in the future.
False but earth-shaking reading: Only children will shape the entire future of humanity.
'It is what it is.'
True but trivial reading: Something exists as it is.
False but earth-shaking reading: Nothing can ever be changed, no matter what.
'Love is a word.'
True but trivial reading: The four letters L-O-V-E spell out the word ‘love’.
False but earth-shaking reading: Love (the human phenomenon) is no more or less important than other phenomena which can also be represented with words: ‘Friendship’, ‘cruelty’, ‘traffic ticket’, ‘fnord’. (In other words: Love is just a label, no more important than any other word.)
'Beauty is in the eye of the beholder.'
True but trivial reading: Beauty is an experience of the physical senses.
False but earth-shaking reading: Beauty is only a matter of personal-belief relativism, and so cannot be discussed or reasoned about with others.
What Makes a Deepity? - A deepity mixes a shallow truth with a misleading, impressive-sounding falsehood. This trick makes the false part sound wise. Unlike poems or complex paradoxes that genuinely explore big ideas, deepities just pretend to be deep—without doing the real work of deep thinking.
3.8. Bullshit
Technical Meaning: In logic, 'bullshit' has a specific technical meaning, as discussed by American philosopher Harry Frankfurt (in his essay 'On Bullshit' and subsequent book).
Not Just Lying: It is not merely lying, but rather 'blustering, pontificating, or gabbing on some topic which you know nothing about'. The result is 'hokum and hot air', even if some statements are incidentally true.
Frankfurt's Distinction: 'What is wrong with a counterfeit is not what it is like, but how it was made.' A liar crafts statements relative to what they believe to be true, hiding facts or intentions. A bullshitter, however, does not care about the truth-values of their statements.
Frankfurt: "Since bullshit need not be false, it differs from lies in its misrepresentational intent…The fact about himself that the bullshitter hides, on the other hand, is that the truth-values of his statements are of no central interest to him."
Core Nature: Bullshit issues from someone who 'neither knows, nor cares, what the truth might be'.
Circumstances of Appearance:
When people feel obliged to talk about things they know nothing about (e.g., societal pressure to have opinions on everything).
In environments where relativism and excessive skepticism are widely accepted, leading to the belief that 'no one can be sure of anything, so you may as well say whatever you want'.
Can also be uttered by someone who enjoys lying for fun, keeping people on edge, or being the center of attention.
Consequences: While it can be fun to utter, it is not enlightening for speakers or listeners.
Contrast with Socratic Wisdom: Bullshit is seen as the opposite of Socratic Wisdom; it is the utterance of a person unable or unwilling to say 'I don’t know.'
3.9. Relativism
Context in Philosophy: Philosophical arguments often involve debates with opposing positions (e.g., morality of the death penalty). Philosophers aim to establish whether claims are true or false, treating them like facts.
Beginner's Discomfort: Novice philosophers may struggle with treating moral, epistemic, or aesthetic claims as objectively right or wrong, as arguments for both sides can be compelling. This can lead to the impression that 'both sides are right', which is problematic.
Contradiction: It is a contradiction to assert that a proposition, such as the moral permissibility of the death penalty, both is and is not true. Nuance is required to specify conditions under which such claims might hold.
Definition of Relativism: The view that a claim is only true or false relative to some other condition. Two common varieties are:
Subjective Relativism (Personal Belief Relativism):
Claim: Truth depends on what an individual believes ('truth is in the eye of the beholder'). Something is true if, and only if, someone believes it to be true, and then it is true for that person (and perhaps only for them).
In Ethics: An action is morally right if the person performing it believes it to be morally right; nothing else makes an action right or wrong except the individual's judgment.
Problems: Makes beliefs exempt from justification or examination. While challenging others' beliefs might seem arrogant, critically examining what one believes and why is fundamental to philosophy. Simply stating 'Alice believes X is okay, so X is right for her' ignores whether Alice has examined her beliefs or if they are based on false information. Investigating beliefs helps achieve consistency and conscientiousness in ethical choices.
Cultural Relativism:
Claim: Something is true, or right, etc., because it is generally believed to be so by a particular culture or society, and then it is true, or right, etc., for that society.
Potential Benefit: Can encourage cultural sensitivity and a critical look at one's own culture by revealing how concepts deemed natural (e.g., gender binary) are non-universal and contingent.
Problems:
Disagreement within a Culture: It does not allow for internal dissent. If a culture permits the death penalty, a member of that culture cannot morally argue against it by this logic, as the practice is already deemed morally acceptable.
Explaining Moral Progress: It struggles to explain moral change or progress. For instance, it would suggest slavery became immoral only after its abolition, rather than people realizing it was inherently wrong.
Conflating Moral and Social Norms: Cultural relativism fails to distinguish moral issues from social norms and etiquette. While it may plausibly apply to social practices, other factors, such as human rights, can override cultural variations in genuinely moral contexts.
Conclusion on Relativism: While problematic, the issues with relativism do not necessarily mean ethical or epistemic truths are always universal and absolute. There is a conceptual space between extreme individual relativism and accepting a single general moral principle. Being open to other cultures' beliefs is important but does not mean accepting them without good reasons.
3.10. The Consequences of Bad Habits
Engaging in bad thinking habits can have serious consequences, including:
Increased vulnerability to intimidation, bullying, or manipulation by others.
Decreased ability to advocate for oneself or others in need.
Difficulty in distinguishing between truth and lies.
Fostering dogmatism and closed-mindedness.
Reducing flexibility, creativity, and preparedness for unpredictable changes.
Leading to the justification of moral decisions that needlessly harm people, including oneself.
Prompting the suppression or ignoring of reliable or important contradictory evidence.
Provoking confusion or anger when confronted with reasons indicating problematic or faulty beliefs.
Preventing serious philosophical thinking about life's most important problems.
Hindering personal growth, maturity, and self-awareness.
3.11. Curiosity
Definition: As an intellectual habit, curiosity is the desire for knowledge. An intellectually curious person is not satisfied with common explanations and seeks to discover more about what is new, strange, or interesting.
Approach to the Unknown: When encountering something different, unusual, unexpected, or even weird and scary, a curious person confronts it directly, makes an honest attempt to investigate, and is not content to let it remain mysterious. Philosophers and scientists strive to understand things as completely as possible, thereby rendering them less mysterious.
Benefits: Intellectual curiosity is crucial for good reasoning and prevents closed-minded dogmatism. It leads to:
Discovery.
Invention.
Expanded awareness of the world and the self.
Sometimes beauty, sometimes power.
Most importantly, it generates and depends on a profound sense of wonder.
Challenging Misconceptions: The notion that rationality limits experiences, kills creativity, or diminishes imagination is incorrect. Those who hold such views may have limited their own experiences by excluding a powerful and successful way of knowing the world. Such claims might also be used to control others by discouraging them from asking questions.
3.12. Self-Awareness, and Socratic Wisdom
Self-Awareness:
Meaning: Derived from the ancient Greek phrase 'γνῶθι σεαυτόν' (know yourself), inscribed at the Temple of Apollo at Delphi. It involves understanding one's own presuppositions, desires, biases, worldviews, habits, faults, powers, and talents, as well as the fundamental aspects of being a thinking human being.
Challenge: Self-awareness can be difficult to achieve, often revealing itself only when one's worldview is challenged by others.
Importance: Essential for making sound decisions and resisting manipulation.
Socratic Wisdom:
Meaning: The willingness to acknowledge what one does not know. It is a helpful exercise in cultivating mature self-awareness.
Components: Knowing the limits of one's knowledge is a significant part of self-identity. It requires courage because admitting ignorance can be embarrassing.
Benefits: A healthy sense of one's own ignorance, combined with curiosity, can lead to a fulfilling life of intellectual discovery.
3.13. Physical Health
Connection to Thinking: Taking care of physical health is a crucial good thinking habit. Being unwell, sleep-deprived, stressed, or physically uncomfortable significantly hinders one's ability to observe, understand, and reason clearly about situations.
Components of Good Health for Thinking:
Sufficient exercise.
Eating healthy, real food and avoiding junk food.
Regular bathing.
Adequate sleep.
Caring for mental health, often achieved through daily restful leisure activities.
Scientific Evidence (Japanese Study):
Psychologists in Japan found that people who spent time gazing at forest scenery produced less salivary cortisol (a stress hormone).
Walking in natural settings also reduced high blood pressure and heart rate fluctuations.
These findings led to 'forest therapy' programs in Japanese municipalities for stressed factory workers.
Rest vs. Stimulation: High-stimulation activities (video games, action films, intense sports) are fun but not restful. Good critical thinking requires calm, peace, and quiet.
Recommendation: Dedicate time each day to genuinely relaxing activities such as walking in a forest, meditating, reading, or cooking and eating a proper meal, without multitasking. Addressing physical discomfort (e.g., showering, a healthy dinner, a walk, good sleep) can significantly improve the ability to deal with frustrating problems.
3.14. Courage
Necessity of Courage in Thinking:
Thinking can lead to unwelcome conclusions by oneself, friends, associates, or figures of authority (boss, teacher, priest, family, government).
Expressing such thoughts can entail risks: job loss, ostracism, criticism, arrest, imprisonment, or even death, even in countries with freedom of speech.
Courageous Thinking Defined: Thinking and expressing dangerous thoughts anyway, without fear. It involves committing to what is rationally judged as the best conclusion, regardless of personal preference or the approval of others. This is harder than it sounds due to social (desire for acceptance, love, inclusion) and institutional (laws, corporate policies) pressures to remain silent.
Personal Courage: Required when questions challenge fundamental parts of one's worldview:
'What if there are no gods?'
'What if there is no objective moral right or wrong?'
'What if a popular or charismatic person is lying?'
'Am I participating in or benefiting from something unjust or evil at my workplace?'
'What if life has no purpose or meaning?'
Taking such questions seriously can lead to self-doubt, despair, and life changes. Even posing them can cause trouble with friends and family due to social pressures.
Public or Political Courage: Required when challenging social arrangements:
Simple examples: Supporting a different sports team than one's home city or friends.
Complex/dangerous examples: Opposing policies of a large corporation or criticizing entities with political power (government, church leaders), risking job loss, arrest, shaming, or dismissal.
Voltaire: "It is dangerous to be right in matters on which the established authority is wrong."
Parrhesia (Classical Greek):
Meaning: 'Bold speech'. A parrhesiastes is a person who makes such a bold statement.
Two Qualities: The speaker must incur some personal risk from social or political forces, and the speaker's words must be true (not merely for controversy).
Modern Equivalent: 'Whistle-blowers'—individuals who expose moral wrongdoing in workplaces, governments, or other groups. They often face severe consequences like harassment, defamation, job loss, lawsuits, vandalism, and death threats.
Core of Courageous Thinking: Prioritizing truth over personal interests (and sometimes personal safety). It also signifies being an agent for necessary change.
3.15. Healthy Skepticism
Definition: A general unwillingness to accept things at face value or as they are presented by others, regardless of who those others are. It means not jumping to conclusions.
Distinction from Excessive Skepticism: Unlike excessive skepticism, healthy skepticism does not require doubting absolutely everything or trusting no one. Instead, it involves investigating many possibilities before settling on the best available explanation.
Basis for Trust/Doubt: Healthy skepticism is willing to trust but requires a prima facie reason for trust or doubt. A 'prima facie' reason is evidence that appears to show something to be true 'on the face' or 'at first glance', before deep investigation. (In simple terms: something appears to be true based on the first evidence you see, unless further information proves otherwise.)
Perceptual Intelligence: 'Gut instincts' or hunches are described as 'perceptual intelligence'. This is an intellectual exercise where the unconscious mind seeks patterns, compares them to past events, recalls what followed, and reports findings as feelings. While a good starting point for investigation, prima facie conclusions can sometimes be misleading or wrong upon closer inspection.
Synonym: Healthy skepticism is also known as reasonable doubt.
3.16. Autonomy
Definition: To think autonomously means to think for yourself and not permit other people to do your thinking for you.
Independence: Autonomous thinkers do not blindly accept information or opinions from parents, friends, role models, governments, newspaper columnists, or any other influencing figure.
Sole Obligation: The only obligation in thinking (if it is an obligation) is to think clearly, consistently, rationally, and, when necessary, courageously.
Outcome: After autonomous and critical self-reflection, one may conclude that their worldview aligns significantly with that of their influences. This is acceptable, as the crucial point is that the worldview is now one's own, derived from personal inquiry, not merely inherited.
3.17. Simplicity
Principle: When confronted with two or more explanations of roughly equal merit for a phenomenon, the simpler explanation should be preferred.
Connection to Complexity: While reason often uncovers layers of complexity behind appearances, this principle applies when explanations are comparably good.
Ockham's Razor:
Nomenclature: This principle is also known as Ockham's Razor.
Origin: First articulated by Brother William of Ockham, a medieval Franciscan monk.
Latin Formulation: "Entia non sunt multiplicanda sine necessitate."
English Translation: "Entities are not to be multiplied without necessity." (In other words, don't make something more complicated than it needs to be.)
Illustrative Example: It is simpler to believe there is one table in a given position than 23 identical tables occupying the exact same space and time. (In other words: you could imagine 23 identical tables in the same place, but it is far simpler, and makes more sense, to say there is just one table.)
Ockham's Original Context: Theological; he used it to argue that monotheism (one infinite God) is a simpler assumption than polytheism (a dozen or more gods). (In other words: Ockham used this to argue that it’s simpler to believe in one God than in many.)
Applications: The idea has been broadly applied across various fields, including scientific theory development and interpretation of art (poetry, film, literature).
Alternative Phrasings: "All other things being equal, the simplest explanation tends to be the truth," and "The best explanation is the one which makes the fewest assumptions."
In short, simplicity in reasoning means: if two answers work equally well, go with the one that assumes less and explains more directly.
3.18. Patience
Nature of Critical Thinking: Good philosophical thinking requires time, and progress in critical thinking is often very slow. It is not an efficient process measured by maximizing inputs and outputs in the shortest time, unlike manufacturing.
Reason for Patience: Good critical thinking necessitates uncovering subtleties—small differences or delicate details that gain importance upon contemplation—which are initially hard to discern and easy to overlook.
Example 1: The diverse ways the word 'Yes' can be uttered, each conveying a different meaning, highlights the importance of precision.
Example 2: In a Piet Mondrian painting, such as 'Composition with Yellow, Blue, and Red,' the white squares framed by black lines are not all the exact same shade of white. This subtlety is unnoticeable in a quick glance or on a low-resolution screen. It is the role of reason to uncover such subtleties and for them to be examined directly.
Conclusion: The search for these subtleties cannot be rushed, emphasizing the critical value of patience in intellectual endeavors.
3.19. Consistency
Worldview Definition Revisit: A worldview is defined as 'the sum of a set of related answers to the most important questions in life.'
Importance: A worldview must be consistent, meaning its answers to life's big questions should cohere well and not overtly contradict each other.
Consequences of Inconsistency: Inconsistent thinking often leads to mistakes, generates cognitive dissonance (an uncomfortable feeling), and can be embarrassing.
Benefits of Consistency: While consistent thinking does not eliminate mistakes, it makes identifying and correcting them much easier.
Broader Meaning: Consistency also extends to staying on topic, adhering to facts, and following an argument through to its conclusion. While random exploration of ideas can be enjoyable, maintaining focus becomes critical as problems become more serious.
Avoiding Fallacies: Digressing too far from the topic can lead to logical fallacies such as Straw Man and Red Herring.
3.20. Openness and Open-mindedness
Definition: Involves listening to others, taking their views seriously, and treating their ideas respectfully, even while critically examining them. It also means presenting one's own views open to critical scrutiny, without resorting to fear or force.
Principle of Charity: In philosophy, this principle requires speakers and listeners to interpret and understand each other's ideas in the best possible light. Listeners should assume other speakers are rational (unless strong evidence suggests otherwise) and that their statements are rational, even if not immediately obvious. This is a professional courtesy among philosophers.
What Open-mindedness Isn't:
It does not mean accepting everyone's ideas as equally valid.
It is not the same as assuming all things are true.
It is not the same as relativism.
Core Behavior: An open-minded person seeks the best explanation for things, regardless of personal preference or whether it fits their existing worldview. They are open to the possibility of being wrong, having a faulty worldview, or needing to change their thinking about important matters, but only for good reasons.
Benefits:
Ensures proper understanding of other people's views, preventing the Straw Man fallacy.
Facilitates finding common ground, which is essential for quelling conflict.
Ensures that rejected ideas are rejected for the correct reasons.
Helps prevent intellectual or ideological differences from escalating into personal grudges.
Example (Rain at a Picnic):
If rain starts, one person might suggest ghosts, another air pressure changes.
The open-minded person does not necessarily accept both as equally possible and stop there. Instead, they seek evidence for each explanation. If evidence is lacking for one, they reject it and continue the search for evidence for another.
In contrast, a closed-minded person chooses the explanation they prefer, regardless of evidence, and refuses to consider alternatives. Closed-mindedness is indicative of a 'value program' (a worldview resistant to change).
A general rule: the closed-minded person is often quick to label others as closed-minded, especially when their own ideas are criticized.
Conclusion: Open-mindedness helps in arriving at good explanations. It does not mean all explanations are equally valid, nor does it require placing unlikely explanations on par with those supported by verifiable evidence or logical structure. It means giving a fair examination to every explanation that appears sound, at least initially, irrespective of its origin or originator.
3.21. Asking for Help
Balancing Independence and Collaboration: While good thinking often requires independence and autonomy, and problems can arise from excessive external influence (e.g., peer pressure), seeking help from respected, knowledgeable others can be beneficial.
Value of Advice and Guidance: It is acceptable to ask trusted individuals for advice and guidance while still making one's own decisions.
Benefits of External Perspectives: Hearing different points of view or discussing issues with someone who can be both critical and appreciative can be helpful. The shared wisdom and experience of friends, elders, and associates often lead to new perspectives and better decisions.
Others can suggest possibilities not previously considered.
They might possess relevant knowledge that broadens one's understanding.
Their past experiences with similar problems can clarify one's own situation.
Philosophical Support (Seneca):
Roman philosopher Seneca illustrates this: "Skilled wrestlers are kept up to the mark by practice; a musician is stirred to action by one of equal proficiency. The wise man also needs to have his virtues kept in action; and as he prompts himself to do things, so he is prompted by another wise man."
Selection and Trust: The effectiveness of asking for help depends on whom one chooses, how much trust is placed in them, and the frequency of seeking their input.
Conclusion: Complex problems need not always be tackled alone; a habit of asking for help from peers, elders, and colleagues can clarify thinking and lead to improved solutions.
3.22. Summary Remarks
Bad Habits: The described bad thinking habits do not necessarily or inevitably lead to unsound arguments, false beliefs, or faulty worldviews. They are distinct from fallacies. An argument can be strong even if its conclusion aligns with the speaker's self-interest or cultural presuppositions. However, bad habits signal that thinking is likely not fully clear, critical, or rational, suggesting a premature end to the search for truth.
Good Habits: Similarly, good thinking habits, by themselves, do not guarantee perfect rationality. Nevertheless, they significantly increase the likelihood of one's thinking being rational.
Extra notes:
People think in different ways, and some ways of thinking actually make it harder to find the truth, solve problems, or understand each other. Other ways make those things easier. I’ll call these “good and bad thinking habits.”
I use the word “habits” instead of “rules” because they don’t always apply perfectly. Sometimes a good habit might not work well, or a bad habit might help—but those cases are rare.
You can't have real doubt just by guessing without proof.
To seriously question something, you need to think about how likely other explanations are — not just whether they’re possible.
If another explanation is barely possible and has no real evidence, then it's not a good reason to doubt the original claim. Don't base doubt on mere guesses: if you're unsure about something, there should be real reasons or evidence behind your doubt, not just the idea that "maybe something else happened." Just because another explanation is possible doesn't mean it's likely; if it has little evidence and low probability, it's not a good reason for skepticism.
Example 1: Courtroom Trial
Situation:
A man is on trial for robbery. His fingerprints were found at the scene, he was caught on a security camera, and stolen items were found in his car.
Unreasonable doubt (bad skepticism):
"Maybe someone who looks just like him committed the crime and planted the items in his car."
Why it's unreasonable:
That's just a possibility, but there's no evidence for it, and it's very unlikely. So it doesn't create a reasonable doubt.
🧪 Example 2: Science and Vaccines
Situation:
Many scientific studies show that a vaccine is safe and effective.
Unreasonable doubt:
"Maybe the studies were secretly faked, and the vaccine is actually harmful."
Why it's unreasonable:
There's no real evidence for this idea, and it's highly unlikely. So it's not a solid reason to reject the science.
🚗 Example 3: Car Won't Start
Situation:
Your car won't start, and the battery is old. A mechanic checks and confirms it's dead.
Unreasonable doubt:
"Maybe aliens zapped my car and drained the power."
Why it's unreasonable:
It's technically possible, but there's zero evidence for it, and it's extremely improbable. A dead battery is the far more reasonable explanation.