Fallacies of Weak Induction

  • Definition
    • Category of informal fallacies in which the logical link (inference) between premises and conclusion is weak rather than entirely absent.
    • Differs from fallacies of relevance: premises do supply some evidence, but far less than required for a rational believer to accept the conclusion.
    • Typical pattern: premises contain a ‘shred’ of support that seems plausible at first glance, yet scrutiny shows it insufficient.
    • Practical significance: Recognizing weak-induction fallacies helps evaluate polls, news items, advertisements, political speeches, and courtroom arguments where incomplete evidence is routinely offered.
    • Ethical implication: Relying on such arguments may lead to unsound policies, wrongful convictions, or the spread of misinformation.

Appeal to Unqualified Authority (Argumentum ad Verecundiam)

  • Core Idea
    • An argument cites an authority or witness who lacks the appropriate credibility for the point at issue.
  • Common credibility failures
    1. Lack of relevant expertise or credentials.
    2. Bias, prejudice, or financial/political motive that undermines neutrality.
    3. Intentional deceit or desire to spread misinformation.
    4. Inability to perceive/remember accurately (e.g., poor eyesight, bad memory, intoxication).
  • Classical structure
    • Premise: Authority A states that proposition P is true.
    • Conclusion: Therefore, P is true.
    • Hidden (illicit) premise: Authority A is reliable on this topic.
  • Illustrative examples
    • Medical doctor pronounces on nuclear fusion: Dr. Bradshaw (expert in medicine, not physics) claims muonic atoms will create room-temperature fusion ➔ insufficient authority.
    • Tobacco executive testifies cigarettes are non-addictive: James W. Johnston (financially motivated, biased) ➔ trustworthiness compromised.
    • Nearly blind witness claims to have seen a stabbing from 100 yards at twilight ➔ perceptual competence absent, testimony unreliable.
  • Why fallacious?
    • Expertise is domain-specific; when the claim lies outside the authority's domain, the testimony carries little evidential weight.
    • Bias & motives skew reports toward self-interest, eroding impartiality.
    • Faulty perception/memory means empirical premises become doubtful.
  • Real-world relevance: Media often quote celebrities on scientific matters (vaccines, climate), producing persuasive but poorly supported claims that shape public opinion.

Appeal to Ignorance (Argumentum ad Ignorantiam)

  • Core Idea
    • From the fact that something has not been proven true (or false), the arguer concludes it is false (or true).
  • Typical form
    • Premise: No one has proved proposition P (or \neg P).
    • Conclusion: Therefore, P is false (or true).
  • Key diagnostic features
    1. Issue is inherently difficult or (currently) impossible to test.
    2. The premises provide no positive evidence about the proposition's truth-value.
    3. Cited investigators lack relevant expertise or remain unnamed.
  • Example (fallacious)
    • “Centuries of attempts to verify astrology have failed; hence astrology is nonsense.”
      • The premise reports only a failure to prove; that failure might stem from inadequate methods or unqualified investigators rather than from astrology's falsity, so it supplies no positive evidence against the idea.
  • Legitimate non-fallacious variant
    • When qualified experts in a relevant field conduct thorough searches yet still fail to find evidence, non-existence becomes the most reasonable conclusion.
    • Example: Decades of scientific experiments failed to detect the luminiferous aether ➔ reasonable to infer non-existence.
  • Practical caveat: Science often moves from ‘absence of evidence’ to ‘evidence of absence’ only when detection methods are adequately sensitive and the search exhaustive.
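  • Worked illustration (a simple Bayesian sketch with assumed numbers, assuming a properly conducted search never yields a false detection): let p be the prior probability that the entity exists and s the search's sensitivity, i.e., the probability of detecting the entity if it exists. After a failed search, Bayes' theorem gives P(\text{exists} \mid \text{not found}) = \frac{(1-s)\,p}{(1-s)\,p + (1-p)}. With p = 0.5, a highly sensitive search (s = 0.99) lowers the posterior to about 0.01, while an insensitive one (s = 0.1) leaves it near 0.47; a failed search supports non-existence only in proportion to its sensitivity.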

Hasty Generalization (Converse Accident)

  • Core Idea
    • Drawing a broad conclusion about an entire group/population from an unrepresentative sample.
  • Two main sample defects
    1. Sample size too small (insufficient n).
    2. Sample biased (not randomly or objectively selected), even if numerically large.
  • Example 1: Small sample
    • “Money managers are all thieves; look at Bernie Madoff, Robert Stanford, Raj Rajaratnam.” ➔ Three notorious cases do not justify condemning every manager.
  • Example 2: Large but biased sample
    • Survey of 100{,}000 voters in conservative Orange County shows 68\% support for the Republican candidate; author concludes the Republican will win statewide. The county’s ideological skew makes the sample unrepresentative.
  • Statistical perspective
    • Representative sampling requires both adequate size (to reduce the margin of error \pm\varepsilon) and randomness (to avoid systematic error). Ignoring these conditions invalidates the inductive leap (see the worked illustration below).
  • Ethical/political impact: Misleading polls may shape voter expectations, influence donations, or depress turnout.
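  • Worked illustration (using the survey's figures under the idealizing assumption of a simple random sample): for a sample proportion \hat{p} from a simple random sample of size n, the 95\% margin of error is roughly \varepsilon \approx 1.96\sqrt{\hat{p}(1-\hat{p})/n}. With n = 100{,}000 and \hat{p} = 0.68 this gives \varepsilon \approx 0.003, about \pm 0.3 percentage points. The random error is tiny, yet the statewide conclusion still fails: the margin of error measures only sampling error and cannot correct the systematic error introduced by drawing every respondent from one ideologically skewed county.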

False Cause (Non Causa Pro Causa family)

  • Overall theme
    • Argument mistakenly assumes a causal linkage that is nonexistent, reversed, or oversimplified.
  • Three main varieties
    1. Post hoc ergo propter hoc (“after this, therefore because of this”)
      • Confuses temporal succession with causation.
      • Example: Cheerleaders wear blue ribbons ➔ team loses ➔ conclude ribbons cause losses.
    2. Non causa pro causa (“not the cause for the cause”)
      • Picks a factor that correlates with the effect but is actually itself an effect, or is causally unrelated; often involves reversing cause and effect.
      • Example: “Executives earn >\$100{,}000, so raising Ferguson’s pay to \$100{,}000 will make him successful.” Salary likely an effect, not the cause, of executive success.
    3. Oversimplified cause
      • Complex phenomenon explained by citing only one among many causal factors.
      • Example: Declining school quality blamed solely on teachers; ignores funding, class size, curriculum, socio-economic factors, etc.
  • Analytical tools
    • Causal inference requires controlling for confounding variables, establishing temporal precedence, and identifying a plausible mechanism. Common methods: randomized controlled trials and statistical regression (see the worked illustration after this list).
  • Practical danger: Faulty causal attributions can drive ineffective policies (e.g., raising pay without training) or scapegoat groups unfairly (teachers).
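  • Worked illustration (a minimal sketch with assumed numbers, building on the hypothetical salary example): suppose a latent factor Z (say, managerial skill) drives both salary X and success Y, e.g. X = 2Z + \epsilon_1 and Y = 3Z + \epsilon_2 with Z, \epsilon_1, \epsilon_2 mutually independent. Then \mathrm{Cov}(X, Y) = 6\,\mathrm{Var}(Z) > 0, so salary and success correlate even though neither causes the other; controlling for Z (by regression, or by randomizing the intervention) removes the spurious association.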

Slippery Slope

  • Core Idea
    • Arguer claims that a seemingly innocuous first step will inevitably trigger a chain of events culminating in an extreme (usually catastrophic) outcome.
  • Logical structure
    1. If action A occurs, then event B will follow.
    2. B leads to C, C to D … eventually disaster Z.
    3. Therefore, avoid action A to prevent Z.
  • Fallacious when
    • The links in the causal chain are not substantiated, or the probability of moving from one step to the next is low.
  • Example
    • Failure to outlaw pornography ➔ rise in sex crimes ➔ moral decay ➔ general crime wave ➔ collapse of civilization. Each link is speculative, unsupported.
  • Evaluation guideline: Require independent evidence for each causal connection; consider mechanisms, statistical data, and possible intervention points (see the worked illustration below).
  • Ethical concern: Slippery slope rhetoric can stifle reforms (e.g., same-sex marriage, drug decriminalization) by exaggerating remote risks.
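  • Worked illustration (with assumed link probabilities): if each link in a five-step chain holds with probability 0.8 given the preceding step, the chain rule puts the probability of the full sequence at 0.8^5 \approx 0.33; with more modest link probabilities of 0.5 it falls to 0.5^5 \approx 0.03. Long speculative chains therefore demand strong evidence for every link, not just the first.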

Weak Analogy (Faulty Analogy)

  • Core Idea
    • Reasoning by comparing two entities that share some properties p,q,r, but inferring a further similarity z without sufficient justification.
  • Standard form
    1. Entity A has features p,q,r,z.
    2. Entity B has features p,q,r.
    3. Therefore, B has feature z.
  • Diagnostic question
    • Are p,q,r causally or systematically connected to z? If not, analogy is weak.
  • Example
    • Comparing car breakdown and heart attack: A passing mechanic (like any driver) has no obligation to stop; therefore, a passing physician has no obligation to assist a heart-attack victim. Analogy fails because professional ethical duties of physicians (Hippocratic tradition, legal Good Samaritan statutes) link medical expertise to emergency aid, unlike mechanical skill.
  • Strengthening an analogy requires demonstrating relevant similarities (shared underlying principles) and disarming disanalogies.

Study & Practice Tips

  • When encountering an inductive argument, identify:
    1. Type of inference (authority, sample, cause, analogy, ignorance).
    2. Implicit assumptions (credibility, representativeness, causal link, chain plausibility, relevance of similarities).
    3. Counter-examples or missing evidence.
  • Actively challenge each premise: Who said it, how big is the sample, what else could cause the effect, are intermediate steps warranted?
  • Ethical reflection: Critical reasoning protects public good—prevents policy errors, judicial miscarriages, and manipulative persuasion.

In-Class Exercise (recap)

  • Students paired up for 5-minute practice identifying the above fallacies in assigned problems.
  • Group discussion followed to reinforce diagnostic skills and clarify ambiguities.