Attitude Change and Persuasion — Week 5 Notes
Attitude Change: Overview and Mechanisms
- Attitude change = psychological process where an initially formed attitude is modified.
- Occurs in two broad ways:
  - Spontaneous exposure to persuasive content (e.g., billboards, ads) leading to automatic attitude shifts.
  - Deliberate persuasion (e.g., sales pitches, political campaigns) aimed at changing attitudes.
- Attitudes vary in strength:
  - Strong attitudes: highly persistent, color judgments, stable over time, resistant to change.
  - Weak/moderate attitudes: more amenable to change.
- People are not perfectly objective when encountering new information due to cognitive biases and prior attitudes.
- Key cognitive biases and concepts discussed:
  - Confirmation bias: preference for information that confirms preexisting beliefs; discomfort with inconsistent information.
  - Selective exposure: seeking information consistent with preexisting attitudes; avoiding inconsistent information.
  - Biased assimilation: interpreting new information to fit preexisting attitudes, even when the information is neutral.
  - Self-verification motive: information that aligns with how we see ourselves is given more weight.
- Practical implications: selective exposure and biased assimilation help explain why attitude change is hard and why polarization persists.
Classic demonstrations of selective processing and bias
Polarizing topics in the U.S. (gun control, abortion, Medicare access, minimum wage) used to study selective exposure:
- In the study's second session, participants chose and read articles framed as supporting or opposing their preexisting attitudes.
- Findings:
  - People tended to read articles that matched their preexisting attitudes (attitude-consistent articles).
  - They spent more time on attitude-consistent content and less on attitude-inconsistent content.
  - Biased assimilation was also observed: participants interpreted new information to fit their views, rating attitude-consistent studies as more methodologically sound and attitude-inconsistent studies as less sound.
Death penalty study (classic):
- Participants with pro- or anti-death-penalty attitudes read two studies arguing for and against.
- Proponents found pro-study more methodologically sound and convincing; opponents found anti-study more sound and convincing.
- Demonstrates how preexisting attitudes shape judgment of study quality.
Climate change skepticism and trust in science:
- Data suggest distrust of science is not confined to conservatives; mechanisms of selective interpretation operate across the political spectrum.
- Studies show conservatives exhibit stronger resistance to dissonant science, but liberals also show selective resistance in different topics; polarization effects are not monolithic.
- Those at the ideological extremes (strong conservatives and strong liberals) show the largest gap in resistance to dissonant information; moderates fall closer to the center.
Vested interest demonstration (Kmart vs Target):
- Participants read a persuasive message from a Kmart employee urging them to shop at Kmart (high vested interest) or from a Kmart employee urging them to shop at Target (low vested interest).
- Result: the low-vested-interest source elicited more attitude change than the high-vested-interest source (n≈41 vs n≈44; reported as a significant difference).
- Implication: perceived vested interest can dampen attitude change; sources seen as neutral or as having less personal stake are more persuasive.
In-group vs out-group and vested interest in research perception:
- UK participants evaluated a researcher’s vested interest and trustworthiness when the researcher studied weight stigma vs ageism.
- Findings: when weight stigma was the focus, larger-bodied researchers were perceived to have greater vested interest and were trusted slightly less; smaller-bodied researchers were trusted slightly more in that domain.
- Suggests that researchers studying their own in-group can be viewed as biased, reducing perceived objectivity and trust in their findings in some contexts.
False consensus effect (social consensus):
- People often overestimate how many others share their views (false consensus).
- Example: among Australians, only a small minority held strong negative attitudes toward asylum seekers, yet 83% believed that a majority shared their view.
- Actual social consensus (when present) can counter false consensus effects by signaling the true majority attitude.
- Practical use: informing interventions by highlighting actual consensus can foster attitude change toward the real norm.
- Conspiracy thinking can dampen responsiveness to consensus information; conspiracists distrust mainstream explanations and consensus signals.
Elaborating attitude change: processing routes (ELM)
Elaboration Likelihood Model (ELM) as a dual-route theory:
- Central processing route: high elaboration of message content; careful consideration; attitude change driven by argument quality.
- Peripheral processing route: low elaboration; attitude change driven by superficial cues (e.g., source attractiveness, credibility cues) rather than argument content.
When each route is used:
- Central route occurs with high motivation and high cognitive capacity (time, resources) to process the information.
- Peripheral route occurs with low motivation or low ability to process information (e.g., time pressure, cluttered environment).
Implications for persuasion:
- For high-stakes, relevant issues, central route processing is more likely; durable attitude change depends on the strength and quality of arguments.
- For low-stakes, everyday choices, peripheral cues can drive quicker, less durable attitude shifts.
Simple illustration:
- Central route: attitude change depends on argument quality Q when motivation M and ability are both high.
- Peripheral route: attitude change depends on peripheral cues C when motivation or ability is low.
Examples of central vs peripheral in everyday life:
- Central route example: evaluating a career decision or major environmental policy involves careful consideration, evidence weighing, and long-term implications.
- Peripheral route example: a street ad or a low-stakes product purchase (e.g., shampoo) may rely on scent or visuals rather than deep argumentation.
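The route-selection logic above can be sketched as a toy decision rule (a minimal illustration, not a formal model from the lecture; the function name, numeric scales, and threshold are assumptions):

```python
def predicted_route(motivation: float, ability: float, threshold: float = 0.5) -> str:
    """Toy ELM sketch: the central route requires BOTH high motivation and
    high ability (time, cognitive resources); otherwise processing falls
    back on peripheral cues such as source attractiveness or credibility.
    Inputs are hypothetical 0-1 scales; the 0.5 cutoff is illustrative."""
    if motivation >= threshold and ability >= threshold:
        return "central"    # attitude change driven by argument quality
    return "peripheral"     # attitude change driven by superficial cues

# High-stakes career decision with time to think -> central processing
print(predicted_route(motivation=0.9, ability=0.8))  # central
# Low-stakes shampoo purchase under time pressure -> peripheral processing
print(predicted_route(motivation=0.9, ability=0.2))  # peripheral
```

The key design point mirrors the notes: both motivation and ability must be high for central processing, so a deficit in either one alone is enough to push processing onto the peripheral route.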
Advertising demonstration (peripheral route):
- A long advertisement was shown to illustrate emotional cues rather than clear information about product quality (e.g., a Subway ad relying on emotional signals rather than explicit claims about the product).
- Conclusion: peripheral cues can be persuasive even when content quality is not strongly informative.
Source credibility and vested interest as modifiers of attitude change:
- Credible sources are characterized by trustworthiness and expertise; likely to produce attitude change when audiences value the topic.
- However, credibility is not binary; perceived vested interest can reduce credibility even for trusted experts if they are seen as biased or having a stake.
- Source credibility can be undermined if the source is perceived to have a personal or financial stake in the outcome.
- Practical takeaway: to maximize attitude change, tailor message and source to match the audience and minimize perceived vested interest when aiming for broad acceptance.
Vested interest and in-group dynamics (demonstration):
- Studies show that sources seen as studying their own in-group can be perceived as having greater vested interest, reducing trust; but the effect varies by topic and domain (e.g., weight stigma vs ageism).
- The broader implication is that audience perceptions of bias can shape the effectiveness of persuasive attempts depending on how closely the source aligns with the audience’s identity.
Social consensus as a lever for persuasion:
- Communicating that a majority of people share a belief often leads others to align with that norm, both to avoid social sanction and to conform to the perceived majority.
- However, conspiracist thinking can blunt this effect, as conspiracists distrust consensus information.
Message framing and moral foundations (Moral Framing):
- Framing a message to align with the recipient’s values increases the likelihood of attitude change.
- Moral foundations theory distinguishes conservatives and liberals in terms of prioritized moral values:
- Conservatives tend to emphasize patriotism, traditionalism, sanctity, purity.
- Liberals tend to emphasize compassion, care, fairness, equality.
- Moral reframing research suggests tailoring pro-environmental messages to the audience’s values (e.g., patriotic framing for conservatives) can reduce polarization and increase openness to the message.
- Example finding: in a U.S. study, a patriotic frame reduced liberal-conservative polarization, making conservatives more receptive to pro-environmental attitudes, while liberals remained responsive to care-based framing.
- Important caveat: people trust messages coming from their political in-groups more than out-groups; excessive reliance on in-group framing can risk reinforcing polarization.
Cross-topic environmental attitudes and polarization (Australia/U.S. examples):
- Left-leaning voters typically rate environmental protection as highly important; right-leaning voters may prioritize different values.
- Patriotic framing can bridge some of the polarization gap more effectively than traditional compassionate frames for certain conservatives, illustrating the value of value-aligned framing.
Activism and persuasion strategies (activist dilemma):
- Activists aim to shift societal attitudes on issues they care about and may use a range of tactics from peaceful to extreme.
- Pros of radical tactics:
  - Increase social and financial pressure on institutions; raise public awareness; generate media coverage.
- Cons of radical tactics:
  - Can alienate observers and reduce broad public support for the movement.
- Experimental evidence on protest tactics:
  - Protests were manipulated for disruptiveness (moderate vs extreme) and stance (pro vs anti gun ban); more extreme tactics reduced moral engagement, emotional connection, identification with the movement, willingness to join, and overall support for the cause.
- Overall takeaway: nonviolent, moderately disruptive tactics tend to maximize broad public support and engagement.
Preventing attitude change in the age of misinformation
- Why prevention matters: widespread rejection of science and misinformation can have dangerous real-world consequences (e.g., misinformation-driven protests and anti-immigrant sentiment).
- General anti-misinformation strategies discussed:
  - Promote deliberate, slow information processing (debate and critical thinking) to reduce belief in false headlines while preserving true beliefs.
  - Debunking after misinformation exposure can still leave residual effects; correction is difficult once beliefs are formed.
  - Forewarning (pre-bunking) helps people anticipate misinformation before exposure, but can provoke resistance if seen as manipulation.
- Psychological inoculation (inoculation theory): an effective protective strategy against misinformation. Three steps (as outlined in the lecture):
  1) Forewarn: make people aware of the various guises and forms of misinformation they might encounter.
  2) Counter-argue: provide concrete counterarguments or information that can be used to refute misinformation.
  3) Rehearse: practice applying the counterarguments in real situations. (This step was not explicitly named in the transcript, but is typically described as rehearsal or application.)
- Large-scale inoculation resources and campaigns have been shown to be effective; examples include gamified interventions that teach people to spot misinformation:
  - Bad News (game): improves ability to spot misinformation and resist sharing it.
  - Bad Facts (game): similar protective effects.
  - Cranky Uncle (game): targets climate misinformation with a focus on recognizing manipulation tactics (e.g., conspiracy theories, fake experts).
- Practical implications: inoculation-based programs can be adapted to current misinformation contexts (COVID-19 to climate change) and used as preventive tools in education and public outreach.
Quick takeaways for exam-ready understanding
- Attitude change is complex and influenced by cognitive biases, motivational states, and the credibility of the source.
- The Elaboration Likelihood Model explains when people are likely to be persuaded by strong arguments (central route) versus superficial cues (peripheral route).
- Perceived vested interest and source bias can undermine attitude change, even when the message is accurate.
- Social consensus signals and framing can significantly shift attitudes, but polarization can be mitigated through value-aligned framing (moral reframing).
- Activism presents a trade-off between visibility and broad support; moderate, nonviolent tactics tend to maximize public endorsement.
- Preventing misinformation relies on inoculation (forewarning + counter-arguments + rehearsal), debunking, and promoting deliberation; gamified interventions can be effective public-facing tools.
Exam-ready prompts (concepts to recall)
- Differentiate spontaneous vs deliberate attitude change with examples.
- Explain confirmation bias, selective exposure, and biased assimilation; provide a real-world example from the transcript.
- Describe the Elaboration Likelihood Model and give an example of central vs peripheral route processing.
- Define source credibility and vested interest; discuss how vested interest can paradoxically reduce credibility in some contexts.
- Explain social consensus and the false consensus effect; provide a real example from asylum-seeker attitudes.
- Compare moral framing (patriotism-based vs care-based) and discuss when each might reduce polarization.
- Summarize the activist dilemma and the empirical findings on moderate vs extreme protests.
- Outline the three steps of psychological inoculation against misinformation and the evidence for online inoculation games.
- Discuss how polarization can be both a natural outcome of information processing biases and a target for intervention through framing and consensus information.
Connections to broader themes
- Links to self-identity and self-verification: how we seek information that confirms how we see ourselves.
- The role of cognitive dissonance in resisting information that contradicts our attitudes, and how dissonance can be exploited to promote behavior change.
- The ethics and practicality of using framing and inoculation in public communications; balancing persuasive goals with respect for autonomy and informed consent.
- The impact of algorithms on echo chambers and attitude polarization in digital environments; the importance of countering misinformation with proactive inoculation and consensus messaging.
Summary of key concepts and terms (glossary)
- Attitude change: modification of an already formed attitude.
- Confirmation bias: tendency to search for, interpret, and remember information that confirms one’s preconceptions.
- Selective exposure: seeking information consistent with one’s beliefs and avoiding conflicting information.
- Biased assimilation: interpreting new information in a way that supports one’s preexisting attitudes.
- Self-verification motive: seeking information that confirms one’s self-view.
- Central route (ELM): processing route with high elaboration; argument quality matters.
- Peripheral route (ELM): processing route with low elaboration; superficial cues matter more.
- Source credibility: trustworthiness and expertise of the information source.
- Vested interest: personal stake in an outcome that can bias message reception.
- Social consensus/false consensus: perception of how many others share one’s attitude, which may be illusory.
- Moral framing / moral foundations theory: tailoring messages to align with the recipient’s core values.
- Inoculation (prebunking): inoculation strategy that forewarns and provides counter-arguments to strengthen resistance to misinformation.
- Bad News / Bad Facts / Cranky Uncle: gamified interventions designed to improve people’s ability to spot and resist misinformation.
- Activist dilemma: trade-off between tactics that maximize visibility and those that maximize broad public support.
- Cognitive dissonance: psychological discomfort from holding conflicting attitudes or attitudes-behavior inconsistency.