Define misinformation
False or inaccurate information shared without intent to deceive; the sharer believes it or is careless.
Define disinformation
False information deliberately created and shared with intent to mislead, manipulate, or cause harm.
What is fake news?
Fabricated, news-like content that mimics journalism’s form but lacks editorial standards and is designed to deceive.
How fake news differs from misinformation/disinformation
It specifically imitates news media format while lacking verification/editorial processes; typically organized and systematic.
Define information warfare (IW)
Strategic manipulation or suppression of information to gain advantage in conflict or politics; a component of hybrid warfare.
Define deepfake
AI-generated or manipulated media (audio/video/image) that convincingly alters identity, speech, or events to fabricate reality.
Define troll farm
Organized workforce running coordinated, often inauthentic accounts to seed and amplify narratives and harass opponents.
Key cognitive biases behind fake news spread
Selective exposure, confirmation bias, desirability bias—users prefer and believe content aligning with prior beliefs and wishes.
How platform algorithms fuel fake news
Engagement-optimized feeds create echo chambers/filter bubbles, repeatedly surfacing similar content and limiting viewpoint diversity.
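The feedback loop above can be sketched as a toy ranker: candidates whose topic matches the user's past engagements score highest, so each click narrows the feed further. All names here are illustrative assumptions, not any platform's actual code.

```python
from collections import Counter

def rank_feed(items, engagement_history):
    """Toy engagement-optimized ranker: score each candidate item by how
    often the user already engaged with its topic, highest first.
    (Illustrative only -- real feed rankers are far more complex.)"""
    topic_counts = Counter(engagement_history)
    return sorted(items, key=lambda item: topic_counts[item["topic"]], reverse=True)

# Simulate a user whose history already leans heavily toward one topic.
history = ["conspiracy", "conspiracy", "sports", "conspiracy"]
candidates = [
    {"id": 1, "topic": "news"},
    {"id": 2, "topic": "conspiracy"},
    {"id": 3, "topic": "sports"},
]
feed = rank_feed(candidates, history)
print([item["topic"] for item in feed])  # → ['conspiracy', 'sports', 'news']
```

The most-engaged topic surfaces first, which in turn generates more engagements with that topic: the filter bubble is the fixed point of this loop.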
Role of bots in disinformation
Automated accounts rapidly amplify links and memes, inflate perceived popularity, and drive virality across platforms.
Why monetization matters
Ad and view-based revenue incentivizes sensational falsehoods and funds further disinformation activity.
COVID-19 origin falsehoods and harms
Lab/bioweapon claims, racialized rhetoric; correlated with anti-Asian bias and hate incidents during the pandemic.
COVID-19 “5G causes COVID” claims—impacts
Reduced health-protective behaviors; harassment of telecom workers; dozens of arson attacks on cell masts in the UK.
COVID-19 vaccine disinformation effects
Increased vaccine hesitancy/refusal across multiple countries; undermined public health campaigns.
Examples of fake preventions/cures (COVID-19)
Garlic/kimchi, hot baths/saunas, alcohol, hydroxychloroquine, ivermectin, bleach ingestion/injection—none are effective and some are dangerous.
Documented harms from COVID-19 disinformation
Poisonings, hospitalizations, deaths; measurable public health damage during the “infodemic.”
Define conspiracy thinking
Attributing events to secret plots by powerful actors, often resistant to counterevidence; fuels disinformation uptake.
Define hybrid warfare
Blending conventional military action with cyber operations and information operations/disinformation.
Russia–Ukraine IW core narratives (2014 onward)
Crimea as “rightfully” Russian; protection of “compatriots”; Ukraine labeled “Nazi”; claims of “genocide” in Donbas to justify intervention.
How Russia distributed disinformation globally
State media (RT/Sputnik), social platforms, Internet Research Agency (IRA) troll farm networks, and bot amplification.
Example of atrocity denial/rewriting (Ukraine)
Bucha massacre reframed as staged by Ukraine or committed by Ukrainians; dissent criminalized via “false info” laws.
Domestic information suppression in Russia (2022)
Criminalization of “false” reporting about the military (up to 15 years), closure of independent outlets, blocking Western platforms.
Define cheap fake
Manipulated media using simple edits (speed changes, cuts, overdubs, recontextualization) rather than advanced AI.
Cheap fake with deadly impact—example
Edited Pakistani child-safety PSA recirculated on WhatsApp in India as “child-kidnapping CCTV” → mob violence and multiple deaths.
Photo manipulation harm—example
Emma González image altered to show her tearing the U.S. Constitution (the original showed a shooting-range target) → harassment and discrediting campaign.
Audio/video manipulation in politics—example
Nancy Pelosi slurred-speech videos spread widely; moderation responses varied across platforms.
Deepfake in active war—example
Zelensky deepfake video instructing Ukrainians to surrender during 2022 invasion; used as battlefield disinformation.
Goal of media/information literacy
Equip users to evaluate sources/claims critically and reduce susceptibility to falsehoods.
Platform fact-checking approach
Independent raters review items; labels/warnings applied; downranking to reduce reach; removal for the most harmful cases.
Germany’s NetzDG (2017) summary
Requires rapid removal of unlawful content by large platforms; imposes heavy fines; criticized as privatized censorship.
France’s election disinfo law (2018) summary
Mandates removal of manifestly false information that could harm electoral integrity during campaigns.
Core regulatory tension
Balancing free expression (especially in U.S.-style systems) with harm reduction and democratic integrity.
Deepfake detection—key forensic cues
Image warping, lighting inconsistencies, unnatural smoothness, odd pixel patterns; ML-based detectors learn these artifacts.
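The “unnatural smoothness” cue can be illustrated with a toy statistic: real sensor imagery carries noise, so neighboring pixels vary, while a heavily synthesized or blended region can be suspiciously flat. Real detectors learn such artifacts with ML; this threshold check is purely an illustrative assumption.

```python
def local_roughness(patch):
    """Mean absolute difference between horizontally adjacent pixels.
    Natural sensor noise keeps this comfortably above zero; heavy
    synthesis/blending can push it suspiciously low."""
    diffs = [abs(row[i + 1] - row[i]) for row in patch for i in range(len(row) - 1)]
    return sum(diffs) / len(diffs)

noisy_patch = [[10, 14, 9, 13], [11, 8, 15, 10]]     # camera-like noise
smooth_patch = [[12, 12, 12, 12], [12, 12, 12, 12]]  # over-smoothed region

print(local_roughness(noisy_patch) > local_roughness(smooth_patch))  # → True
```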
Authentication/provenance idea for deepfakes
Device-level watermarking/provenance (“truth layer”) to verify originals; requires broad, cross-industry adoption.
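A minimal sketch of the provenance idea: the capture device attaches an authenticity tag to the media, and any later alteration breaks verification. A symmetric HMAC stands in here for the public-key signatures a real scheme would use; `DEVICE_KEY` and the function names are assumptions for illustration.

```python
import hashlib
import hmac

DEVICE_KEY = b"per-device-secret"  # illustrative; real schemes use PKI certificates

def sign_capture(media_bytes):
    """At capture time, the device computes a tag bound to the exact bytes."""
    return hmac.new(DEVICE_KEY, media_bytes, hashlib.sha256).hexdigest()

def verify(media_bytes, tag):
    """Later, a holder of the key checks that the media is unmodified."""
    expected = hmac.new(DEVICE_KEY, media_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

original = b"raw video frames"
tag = sign_capture(original)
print(verify(original, tag))             # → True: untouched original verifies
print(verify(b"deepfaked frames", tag))  # → False: any alteration breaks the tag
```

The scheme only helps if capture hardware, editing tools, and platforms all participate, which is why the card stresses broad, cross-industry adoption.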
Common platform policy gaps on deepfakes
Parody/satire exemptions, limited coverage of “cheap fakes,” inconsistent enforcement for political content.
Why legal remedies often fall short
Defamation/privacy/copyright suits are reactive, slow, costly; anonymity/jurisdictional issues; state actors often immune.
U.S. state actions on deepfakes (examples)
Bans on political deepfakes near elections in some states; civil remedies for deepfake porn; no comprehensive federal statute.
EU Digital Services Act—deepfakes
Providers that become aware content is a deepfake must label it as inauthentic so users are informed.
Why deepfakes are uniquely dangerous
They erode evidentiary trust (“seeing is believing”), enable plausible deniability, and scale political and image-based abuse harms.
Echo chambers’ impact on democracy
Narrow information diets polarize citizens, entrench false narratives, and distort electoral decision-making.
Define filter bubble
Algorithmically curated environment reflecting past preferences, limiting exposure to diverse/contrary viewpoints.
Misinformation vs disinformation (exam tip)
Intent difference: misinformation lacks intent to deceive; disinformation is deliberate deception to mislead/manipulate.
Why reactive legal remedies are insufficient
Harm often occurs before takedown; litigation delays/costs; cross-border and anonymous perpetrators evade accountability.
Comprehensive strategy against online falsehoods
Media literacy, robust fact-checking, detection/authentication tech, targeted regulation, and carefully scoped laws.
Summary of real-world harms from online falsehoods
Illness/death (COVID), infrastructure attacks (5G), hate incidents, voter manipulation, wartime atrocity denial.
Core challenge going forward
Coordinating cross-jurisdictional, rights-respecting responses as platforms, technologies, and adversaries rapidly evolve.