
Introductory quest 

  1. Anchoring bias. People are over-reliant on the first piece of information they hear. In a salary negotiation, whoever makes the first offer establishes a range of reasonable possibilities in each person's mind.

  2. Availability heuristic. People overestimate the importance of information that is available to them. A person might argue that smoking is not unhealthy because they know someone who lived to 100 and smoked three packs a day.

  3. Bandwagon effect. The probability of one person adopting a belief increases based on the number of people who hold that belief. This is a powerful form of groupthink and is one reason why meetings are often unproductive.

  4. Blind-spot bias. Failing to recognize your own cognitive biases is a bias in itself. People notice cognitive and motivational biases much more in others than in themselves.

  5. Choice-supportive bias. When you choose something, you tend to feel positive about it, even if that choice has flaws. Like how you think your dog is awesome — even if it bites people every once in a while.

  6. Clustering illusion. This is the tendency to see patterns in random events. It is key to various gambling fallacies, like the idea that red is more or less likely to turn up on a roulette table after a string of reds.

  7. Confirmation bias. We tend to listen only to information that confirms our preconceptions — one of the many reasons it's so hard to have an intelligent conversation about climate change.

  8. Conservatism bias. The tendency to favor prior evidence over new evidence or information that has emerged. People were slow to accept that the Earth was round because they maintained their earlier understanding that the planet was flat.

  9. Information bias. The tendency to seek information when it does not affect action. More information is not always better. With less information, people can often make more accurate predictions.

  10. Ostrich effect. The decision to ignore dangerous or negative information by "burying" one's head in the sand, like an ostrich. Research suggests that investors check the value of their holdings significantly less often during bad markets.

11. Outcome bias. Judging a decision based on the outcome — rather than how exactly the decision was made in the moment. Just because you won a lot in Vegas doesn't mean gambling your money was a smart decision.

12. Overconfidence. Some of us are too confident about our abilities, and this causes us to take greater risks in our daily lives. Experts are more prone to this bias than laypeople, since they are more convinced that they are right.

13. Placebo effect. When simply believing that something will have a certain effect on you causes it to have that effect. In medicine, people given fake pills often experience the same physiological effects as people given the real thing.

14. Pro-innovation bias. When a proponent of an innovation tends to overvalue its usefulness and undervalue its limitations. Sound familiar, Silicon Valley?

15. Recency. The tendency to weigh the latest information more heavily than older data. Investors often think the market will always look the way it looks today and make unwise decisions.

Detecting Bias in the Media

  • Bias by omission: For every news story that is selected, there are many others that are left out. Do the news stories you see show a balanced view of real life? What are the characteristics they have in common? (e.g., Are they mostly about violence, famous people, wealth?) Do some news sources include items that are ignored by others?

  • Bias by emphasis: What stories are on the front page or "at the top of the hour?" Which stories get the largest headlines, or the first and longest coverage on TV or radio? Consider how this placement influences people's sense of what is important.

  • Bias by use of language: The use of labels such as "terrorist," "revolutionary," or "freedom fighter" can create completely different impressions of the same person or event.

  • Bias in photos: Unflattering pictures can create bad impressions, and partial pictures of scenes can completely change the context of an event.

  • Bias in the source: An article about a cure for cancer written by a drug company is not the same as an article by an independent researcher. Often, private companies, governments, public relations firms, and political groups produce press releases to gain media exposure and to influence the public.

  • Bias by headlines: Some headlines can be deceptive, as their main purpose is to grab attention. Many people read only the headlines, which can create a distorted sense of what is really going on, or turn a non-event into a sensational event.

  • Bias by repetition: The repetition of a particular event or idea can lead people to believe that it is true, very widespread, and much more important than it really is.

  • Bias in numbers and statistics: Statistics need to be interpreted; they are often used to create false impressions. Of the following statements, which statistic would you use to try to convince someone that the death penalty is a good idea? – Almost 30% of those surveyed support the death penalty. – More than 70% of those surveyed are against the death penalty.

Three different disciplines: anthropology, psychology, and sociology

Anthropology: the study of humans

  • Societies

  • Cultures

  • Evolution

  • Ecology

Sociology: the study of how human societies develop and function

  • Functioning of social groups

  • Social problems

  • Conflicts

  • Social interactions

Psychology: the study of the human mind and human behaviour

  • Personal attitudes, beliefs, and personality

  • Perception and understanding

  • Interpretation of situations
