CGSC 110 - 14. Reasoning & Rationality [TF Guest lecture]

12 Terms

1
New cards

If A, then B. A. Therefore, B.

What is modus ponens?

2
New cards

If A, then B. B. Therefore, A.

What is affirming the consequent?

3
New cards

If A, then B. Not B. Therefore, not A.

What is modus tollens?

4
New cards

If A, then B. Not A. Therefore, not B.
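
For intuition about why the first and third forms are valid while the other two are fallacies, here is a small brute-force check (a sketch, not part of the lecture): a form is valid only if no assignment of truth values makes the premises true and the conclusion false.

```python
# Minimal sketch (not from the lecture): brute-force the truth table for each
# argument form. A form is valid only if no assignment of truth values makes
# all premises true while the conclusion is false.
from itertools import product

def implies(p, q):
    return (not p) or q

# Each entry maps a form to (all premises true?, conclusion) for given A, B.
forms = {
    "modus ponens":             lambda A, B: (implies(A, B) and A,     B),
    "affirming the consequent": lambda A, B: (implies(A, B) and B,     A),
    "modus tollens":            lambda A, B: (implies(A, B) and not B, not A),
    "denying the antecedent":   lambda A, B: (implies(A, B) and not A, not B),
}

for name, form in forms.items():
    valid = True
    for A, B in product([True, False], repeat=2):
        premises_true, conclusion = form(A, B)
        if premises_true and not conclusion:
            valid = False  # counterexample found
    print(f"{name}: {'valid' if valid else 'invalid (fallacy)'}")
# modus ponens: valid
# affirming the consequent: invalid (fallacy)
# modus tollens: valid
# denying the antecedent: invalid (fallacy)
```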

What is denying the antecedent?

5
New cards

The tendency to seek only information that would confirm a hypothesis, rather than information that could disprove it.

Define confirmation bias?

6
New cards

Cards: A, D, 3, 7 (the Wason Card Task).

You have to flip over the A and 7 cards, the only ones whose hidden sides could falsify the rule.
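
The falsification logic can be sketched directly. The rule itself isn't stated on this card, so assume something like "if a card has an A on one side, then it has a 3 on the other side": a card is worth flipping only if its hidden side could falsify that rule, and confirmation bias shows up when people instead flip A and 3, the cards that can only confirm it.

```python
# Minimal sketch; the rule is an assumption, since this card only lists the
# cards and the answer. Assumed rule under test: "if a card has an A on one
# side, then it has a 3 on the other side."

def must_flip(face):
    if face == "A":
        return True   # its hidden side might not be a 3 -> could falsify the rule
    if face.isdigit() and face != "3":
        return True   # its hidden side might be an A -> could falsify the rule
    return False      # a non-A letter or a 3 can never falsify the rule

cards = ["A", "D", "3", "7"]
print([c for c in cards if must_flip(c)])  # ['A', '7']
```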

Give an example of confirmation bias?

7
New cards

The availability heuristic is a mental shortcut where people assess the likelihood or frequency of an event based on how easily examples of that event come to mind. In other words, if you can quickly think of instances of something happening, you may overestimate how common or likely that event is. It relies on availability — how easily you can recall or imagine an event — rather than on more systematic, logical reasoning.

  • Probabilities are judged solely on how easily examples come to mind

  • Examples: 

    • Retrievability of instances

      • Plane vs. car (plane crashes come to mind more easily, even though driving is riskier)

    • Imaginability 

      • Sharks vs. hippos (shark attacks are easier to imagine, even though hippos kill far more people)

Here’s how the availability heuristic works:

1. Ease of Recall: The easier it is for you to think of an example of something happening, the more likely you are to believe it happens often or is highly probable.

2. Vividness of Examples: Events that are more dramatic, emotionally charged, or widely reported in the media tend to be more available in your memory. As a result, you might overestimate their frequency or probability because those examples stand out more in your mind.

### Example of the Availability Heuristic:

If you recently watched a news report about airplane crashes, you might begin to overestimate the likelihood of a crash the next time you fly, simply because the examples of crashes are readily available in your memory. Even though statistically, airplane travel is very safe, the vividness and frequency with which you recall these crashes can skew your perception.


Give examples of the availability heuristic in real life

8
New cards
  • Conjunction fallacy

    • Hospitalized with COVID-19 vs. hospitalized with COVID-19 and immunocompromised

  • The conjunction fallacy is a cognitive bias in which people mistakenly believe that the probability of two events occurring together is greater than the probability of one of the events occurring alone, even when this contradicts basic principles of probability.
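
As a sketch (the numbers are made up for illustration, not from the lecture), the product rule makes the ordering unavoidable: being hospitalized with COVID-19 and immunocompromised can never be more probable than being hospitalized with COVID-19 alone.

```python
# Minimal sketch with assumed, illustrative probabilities (not real data).
# P(A and B) = P(A) * P(B | A), and P(B | A) <= 1, so the conjunction can
# never exceed the probability of A alone.
p_hospitalized = 0.05            # assumed P(hospitalized with COVID-19)
p_immunocomp_given_hosp = 0.30   # assumed P(immunocompromised | hospitalized)

p_conjunction = p_hospitalized * p_immunocomp_given_hosp
print(p_hospitalized, round(p_conjunction, 3))  # 0.05 vs. 0.015
assert p_conjunction <= p_hospitalized          # always holds, whatever the numbers
```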

What is the conjunction fallacy?

9
New cards
  • Probabilities are judged solely on how representative something is deemed to be of some category

  • Examples: 

In the scope of representativeness theory, people often judge probabilities or make decisions based on how representative something is of a certain category or prototype, rather than using statistical reasoning or actual probabilities. This bias can lead to systematic errors, as people tend to ignore base rates, rely on intuition, and apply stereotypes, often ignoring important statistical principles. Below are explanations of the examples in the context of representativeness theory:

### 1. Base Rate Neglect

Base rate neglect occurs when people ignore the general frequency (base rate) of an event in the population, instead focusing on how representative the specific case is of a particular category.

- Example: If told that 30% of a population are engineers and 70% are lawyers, and then given a description of a person who is introverted and enjoys math, people tend to guess the person is an engineer, simply because the description seems representative of the engineer stereotype, even though the base rates favor a lawyer.
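
To see how much weight the base rate should carry, here is a small Bayes calculation; the stereotype-fit likelihoods are assumed for illustration, not taken from the lecture.

```python
# Minimal sketch with assumed numbers: Bayes' rule combines the base rates
# with how well the description fits each group.
p_engineer = 0.30              # base rate of engineers in the population
p_lawyer = 0.70                # base rate of lawyers
p_desc_given_engineer = 0.60   # assumed: description fits the engineer stereotype well
p_desc_given_lawyer = 0.20     # assumed: description fits the lawyer stereotype poorly

posterior_engineer = (p_desc_given_engineer * p_engineer) / (
    p_desc_given_engineer * p_engineer + p_desc_given_lawyer * p_lawyer
)
print(round(posterior_engineer, 2))  # ~0.56: far less certain than the stereotype alone suggests
```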

### 2. Steve (The "Linda" Problem)

This refers to a famous thought experiment where people are told about a person named Steve, described as shy, introverted, and interested in literature. When asked whether it is more probable that Steve is a librarian or a librarian who is also active in social causes, many people incorrectly choose the latter, because the richer description seems more representative of the stereotype. This has the same structure as the conjunction fallacy: the judgment tracks representativeness rather than probability, even though a conjunction can never be more probable than either of its parts alone.

### 3. Mammograms

People often misinterpret the results of medical tests like mammograms due to representativeness bias. If a person receives a positive test result, they might assume the probability of having breast cancer is high, even though the test's false positive rate and the base rate of cancer in the population may indicate otherwise.

- Example: Given that breast cancer is relatively rare, a positive result may be more likely a false positive than a true diagnosis, but people often judge based on how "representative" the symptoms or test result seem, without considering the base rate of cancer in the population.
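
Converting the numbers to natural frequencies makes the base-rate effect easy to see; the figures below are assumed for illustration, not real clinical data.

```python
# Minimal sketch with assumed (illustrative) numbers, not real clinical data.
population = 10_000
base_rate = 0.01                # assumed: 1% of screened women have breast cancer
sensitivity = 0.90              # assumed: P(positive test | cancer)
false_positive_rate = 0.09      # assumed: P(positive test | no cancer)

with_cancer = population * base_rate                      # 100 women
true_positives = with_cancer * sensitivity                # 90
without_cancer = population - with_cancer                 # 9,900 women
false_positives = without_cancer * false_positive_rate    # 891

p_cancer_given_positive = true_positives / (true_positives + false_positives)
print(round(p_cancer_given_positive, 2))  # ~0.09: most positives are false positives
```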

### 4. Misconceptions of Chance

People believe that short sequences of random events should reflect the expected long-term probabilities, even though short sequences are often not representative of the overall distribution.

- Example: If a coin is flipped five times and the result is H-T-H-T-H (alternating heads and tails), people may believe the next flip is more likely to be tails, thinking the sequence should "balance out." This misconception ignores the fact that each flip is independent, and the probability of heads or tails remains 50% for each flip, regardless of prior outcomes.
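
A short simulation (a sketch, not from the lecture) confirms the independence point: even after the run H-T-H-T-H, the next flip still comes up heads about half the time.

```python
# Minimal sketch: each flip is independent, so conditioning on a prior run
# such as H-T-H-T-H does not change the probability of the next flip.
import random

random.seed(0)
target = ["H", "T", "H", "T", "H"]
occurrences = 0
next_heads = 0

for _ in range(200_000):
    flips = [random.choice("HT") for _ in range(6)]
    if flips[:5] == target:
        occurrences += 1
        next_heads += (flips[5] == "H")

print(occurrences, round(next_heads / occurrences, 3))  # ratio stays near 0.5
```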

### 5. Casinos and Babies

At casinos, people might believe that after a long sequence of black in a roulette game, red is "due" to come up next, simply because red and black seem to alternate in a way that is representative of a balanced outcome. This is a fallacy, as the outcomes of the spins are independent events, but people’s judgment is influenced by how "representative" the sequence feels.

- Example: Similarly, in the context of babies, parents might feel that after having multiple girls, a boy is "due," even though the chance of having a boy or a girl is independent of past births.

### 6. Insensitivity to Predictability

This refers to people's tendency to overlook statistical data and rely on how representative something feels based on prior experience or stereotypes.

- Example: People might overestimate the ability of a stockbroker or a sports team's performance based on their previous success (how representative they seem of a successful stockbroker or team), ignoring that past performance is not necessarily predictive of future outcomes.

### 7. Choosing Classes

When selecting classes, students might be influenced by how "representative" a class feels in terms of their personal interests or goals, rather than considering the actual difficulty, workload, or how well it fits into their long-term academic plan.

- Example: A student might choose a class on literature because they like reading books, even though the course might be significantly harder than a statistics course, simply because the class seems more representative of their interests.

### 8. Misconceptions of Regression

This is the tendency to misinterpret regression to the mean, the statistical fact that extreme outcomes (either very good or very bad) tend to be followed by more moderate ones. People either fail to expect this regression or invent causal explanations for it, because the extreme outcome feels "representative" of the person's true ability.

- Example: If a sports player has an amazing season, their next season is likely to be closer to their usual level of performance; observers often invent explanations for this "decline" (a jinx, complacency) rather than recognizing it as simple regression to the mean after an unusually good, unrepresentative season.
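
A small simulation with assumed numbers makes the effect concrete: players selected for an extreme first season score closer to the average in the second, even though their underlying skill never changes.

```python
# Minimal sketch with assumed numbers: each season's score is true skill plus
# independent luck. Selecting on an extreme season 1 guarantees that season 2
# looks more average on average, with no change in skill.
import random

random.seed(1)
skills = [random.gauss(100, 10) for _ in range(10_000)]        # true skill
season1 = [skill + random.gauss(0, 10) for skill in skills]
season2 = [skill + random.gauss(0, 10) for skill in skills]

# Look at the players with the top 5% of season-1 scores.
cutoff = sorted(season1)[int(0.95 * len(season1))]
top = [i for i, score in enumerate(season1) if score >= cutoff]

avg_s1 = sum(season1[i] for i in top) / len(top)
avg_s2 = sum(season2[i] for i in top) / len(top)
print(round(avg_s1, 1), round(avg_s2, 1))  # season-2 average falls back toward 100
```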

### 9. Hopes and Dreams of Sports Players' Kids

Parents of athletes often overestimate the likelihood that their children will also become successful athletes, based on how "representative" the child’s physical traits or early achievements seem of a professional athlete. This bias ignores the complex and varied factors that contribute to athletic success.

- Example: A parent might be convinced that their child will become a famous soccer player simply because the child has shown early aptitude in the sport, even though the probability is low and success in sports is often highly dependent on factors beyond just early talent.

### 10. Being Nice or Mean to Your Kid

People may treat their children according to how representative their behavior is of certain expectations or stereotypes (e.g., treating a well-behaved child with more leniency or kindness, or reacting negatively to a rebellious child), even though this may not be the best approach to parenting.

- Example: If a child does something "bad," a parent may treat them more harshly because the behavior fits a stereotype of being "bad," even if the child’s overall behavior is typically good. This judgment is based on how "representative" the child's behavior is of the category of a "bad child."

### Conclusion:

The representativeness heuristic leads people to judge the likelihood of events based on how well they seem to match a particular prototype or category, often disregarding important statistical information like base rates or actual probabilities. This can lead to systematic errors in judgment, such as base rate neglect, misconceptions of chance, and overestimating the likelihood of extreme events. It’s a mental shortcut that feels intuitive but can lead to poor decision-making.

What is the Representativeness heuristic?

10
New cards
  • System 1 is fast, automatic, and emotional — good for quick, routine decisions but prone to errors and biases.

  • System 2 is slow, deliberate, and logical — good for careful analysis and problem-solving but requires more cognitive effort and time.

Both systems are essential for navigating the world: System 1 allows for quick responses in familiar situations, while System 2 helps us make thoughtful, logical decisions when necessary.

Explain System 1 and System 2

11
New cards

The brain has a limited capacity to process information. There is evolutionary pressure to encode as much information as possible with as little neural activity as necessary, which creates the need for selective attention.

  • Evolutionary biology: the brain is very (energetically) expensive

    • So the brain is under pressure to do as little work as possible

    • Constraint from below: neuronal firing speed is inversely related to how many neurons can fire at a time

  • Resource-rational view of heuristics

    • Heuristics are an optimal use of limited resources (brain energy and time)

What evolutionary pressures act on the brain?

12
New cards

By applying these life hacks, you can counteract cognitive illusions and improve your decision-making:

  • Think in terms of frequencies instead of percentages to avoid misjudging rare events.

  • Recognize the conjunction fallacy and avoid overestimating the probability of combined events.

  • Break down complex medical information into its individual components, and be mindful of base rates.

  • Concretize abstract information into something more tangible or evolutionarily relevant to ground your thinking.

  • Use the Wason Card Task logic by focusing on falsifiability rather than confirming assumptions.

Give five hacks for improving decision-making