Funhouse Mirror Factory: Social Media Distorting Norms
Introduction
- The internet has experienced one of the fastest technological revolutions in history [1].
- Approximately 5 billion people worldwide use social media, spending an average of two and a half hours daily online [2].
- Social media is likened to a funhouse mirror, distorting our collective sense of what is normative [3].
- Online discussions are dominated by a small, vocal, and non-representative minority.
- Only 3% of active accounts are toxic, yet they produce 33% of all content [4].
- 74% of online conflicts start in just 1% of communities [5].
- 0.1% of users share 80% of fake news [6, 7].
- This extreme minority stirs discontent, spreads misinformation, and biases the meta-perceptions of passive users, leading to false polarization and pluralistic ignorance.
- Exposure to extreme content can normalize unhealthy and dangerous behavior.
- Teens exposed to extreme alcohol-related content came to believe that dangerous drinking was normative [12].
- The paper aims to explain:
- How online environments become saturated with false norms.
- Who is misrepresented online.
- What happens when online norms deviate from offline norms.
- Where people are affected online.
- Why expressions are more extreme online.
- The goal is to provide a framework for understanding and correcting distortions in social perceptions created by social media platforms.
How Norms Become Distorted Online
- Social norms are defined as the “predominant behaviors, attitudes, beliefs, and codes of conduct of a group” [13].
- Norms are tied to people’s social identity, signaling group affiliation and strengthening ties [14].
- New group members quickly learn and follow new norms, which then remain relatively impervious to reinforcement learning [15, 16].
- Conforming to norms demonstrates commitment to the group [17–19].
- When a norm is ambiguous, people base decisions on the group consensus [20].
- Accurately detecting social norms is critical for social acceptance.
- People form an average representation of the exemplars in a group via ensemble coding [21, 22].
- Ensemble coding is cognitively efficient, encoding a single representation of a set of stimuli rather than memorizing every item [21].
- Socially, ensemble coding allows people to form a single estimation of group emotion or opinion [23, 24].
- People encode social norms from posts and comments in online forums and social media platforms.
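The ensemble-coding idea described above can be sketched as a simple averaging process: an observer who sees a stream of opinion exemplars stores roughly their mean rather than memorizing each item. The toy simulation below is illustrative only; the opinion scale and all numbers are assumptions, not values from the cited studies. It shows that when the observed sample is representative, the encoded norm tracks the true group mean.

```python
import random
import statistics

random.seed(0)

# Hypothetical population of opinions on a -1 (extreme anti) to +1
# (extreme pro) scale, centered near 0: most people are moderate.
population = [random.gauss(0.0, 0.3) for _ in range(10_000)]

# Ensemble coding: the observer keeps a single summary (here, the mean)
# of the exemplars they encounter rather than storing each one.
sample = random.sample(population, 100)
encoded_norm = statistics.fmean(sample)

true_norm = statistics.fmean(population)
print(f"true norm:    {true_norm:+.3f}")
print(f"encoded norm: {encoded_norm:+.3f}")
```

With a representative sample, the encoded norm lands close to the true norm; the distortions discussed in the next section arise when the sample itself is skewed.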
Whose Opinions Are Represented Online?
- Ensemble coding can be distorted online due to the structure of normative information.
- False norms emerge because social media is dominated by a small number of extreme people who post their most extreme opinions at a very high volume [25–27].
- Moderate or neutral opinions are practically invisible online.
- Encountering a disproportionate volume of extreme opinions can lead to false perceptions that norms are far more extreme than they actually are.
- Online consumer reviews often reflect extremely positive or negative experiences [28].
- On platforms like Instagram, there is a norm to present oneself as interesting, attractive, and successful [29].
- On LinkedIn, people disproportionately report successes and accomplishments rather than failures [30].
- In online political discussions, the people who post frequently are often the most ideologically extreme [31, 32].
- 97% of political posts on Twitter/X come from the most active 10% of users [33].
- Most people are ideologically moderate, uninterested in politics, and avoid political discussions offline [34–36].
- In discussions of the Covid-19 vaccine on Twitter, only 0.35% of people were in true echo chambers, yet those users dominated the overall discourse [37].
- An analysis of social media found that a third of low-credibility posts were shared by just 10 accounts [26, 38].
- Biased inputs from the online environment can lead to extremely biased outputs when people rely on ensemble coding.
- This is especially problematic for topics like politics, where moderate opinions are invisible online and people are generally hesitant to share their views in everyday life.
- People base their perception of norms on an unrepresentative sample of opinions and images, leading to a distorted view of social norms.
- People tend to weight extreme content more heavily when taking the average of a set of stimuli [23] and assume greater moral outrage from a post than the authors report feeling [39].
- The conceptualized average of opinions for one’s ingroup and outgroup in the online world can be far more extreme than true offline norms.
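The sampling problem described in this section can be sketched with a second toy simulation: if a small extreme minority posts at far higher volume than moderates, the feed an observer averages over is badly skewed, so the ensemble-coded norm lands far from the true population norm. All group sizes, opinion values, and posting rates below are illustrative assumptions, not figures from the cited studies.

```python
import random
import statistics

random.seed(1)

# Hypothetical population on a 0-10 "extremity" scale:
# 99% moderates, 1% extremists.
moderates = [random.gauss(2.0, 0.5) for _ in range(990)]
extremists = [random.gauss(9.0, 0.5) for _ in range(10)]

true_norm = statistics.fmean(moderates + extremists)

# Posting volume is wildly unequal: moderates post once each,
# extremists post 100 times each, so the *feed* oversamples extremes.
feed = []
for opinion in moderates:
    feed += [opinion] * 1
for opinion in extremists:
    feed += [opinion] * 100

# What an observer relying on ensemble coding over the feed would encode.
perceived_norm = statistics.fmean(feed)

print(f"true population norm:     {true_norm:.2f}")
print(f"norm perceived from feed: {perceived_norm:.2f}")
```

Even though extremists are 1% of this toy population, their posting volume pulls the perceived norm several points toward the extreme, which is the basic mechanism behind the false norms discussed above.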
What Norms Dominate Online Discourse?
- Online discourse is dominated by the most extreme people, and negativity, intergroup hostility, and polarization appear strikingly prevalent online [40–43].
- The most widely shared content on Twitter/X and Facebook is moralized content [40, 42, 44].
- The norms generated on some social media platforms might be more hostile than in the offline world.
- Social media feeds are curated in ways that normalize extreme behavior, fostering a false sense of reality [46].
- Even if people recognize that certain visible behaviors do not reflect how people actually are, those behaviors are still reinforced through “likes.”
- Extreme norms can leave people feeling inadequate.
- Constant exposure to extreme outliers in body shape on platforms like Instagram may contribute to lower body image and depression in teen girls [47].
- Early evidence suggests that spillover from online norms to offline perceptions does occur in some domains.
- Online content that normalizes alcohol consumption can lead college students to overestimate how common drinking is offline [48–50].
- More research is needed on the impact of online content on perceptions of real-world norms [51].
- These norms contradict social media users’ expressed preferences [52].
- Even though incivility from politicians is increasing and is increasingly socially rewarded online [53], people report that they want to hear less from uncivil politicians [54].
Why Are False Norms Worse Online Than Offline?
- Social media operates in an attention economy, where design features and algorithms are built to elicit as much engagement as possible [3, 55].
- There is a strong incentive for users to create content that captures attention and maximizes engagement rather than content that reflects reality.
- Users who are the most active on social media are also the most extreme [25, 32].
- News stories that express outgroup animosity are 67% more likely to be shared on social media [42].
- People with more extreme or hostile beliefs tend to dominate discourse, leading to false beliefs about the norms of a community [31].
- Online dynamics are amplified by design features and recommendation algorithms on various platforms [56, 57].
- A recent analysis of the algorithm on Twitter/X found that it prioritizes evocative content [44].
- People focused on gaining social status are the most hostile online [25].
- There is often little motivation for someone to post a nuanced or moderate opinion on social media.
- Nuanced or moderate posts often risk hostility from more extreme ingroup and outgroup members [58].
- Politically moderate people are more likely to report being harassed online, even though they are less likely to post [31].
- People who “troll” others typically have higher dark triad characteristics [59].
Conclusion
- Users form beliefs about the state of the world based on the most extreme voices they encounter as they scroll through content.
- Being overexposed to the most extreme opinions from the most extreme people can have real consequences.
- Believing that one’s political outgroup endorses extreme political positions may lead to biased meta-perceptions, pluralistic ignorance, and false polarization [9, 10, 39, 61].
- It is challenging to differentiate what is normative vs. unpopular when the content that drives the most engagement is often from a minority of extreme users.
- Misperceptions might be driven by factually accurate but unrepresentative content, making them uniquely difficult to address through content moderation.
- People may develop a distorted sense of reality when they rely on the funhouse mirror to reflect the truth.
Credit author statement
- Claire E. Robertson: Conceptualization, Writing – Original Draft, Writing – Review and Editing, Visualization.
- Kareena S. Del Rosario: Conceptualization, Writing – Original Draft, Writing – Review and Editing.
- Jay J. Van Bavel: Conceptualization, Writing – Original Draft, Writing – Review and Editing.
Declaration of competing interest
- Jay Van Bavel reports financial support was provided by Google Jigsaw.
- Jay Van Bavel reports financial support was provided by John Templeton World Charity Foundation (TWCF-2022-30561).
Data availability
- No data was used for the research described in the article.