Notes on Social Media Influence and Propaganda

  • Internet Research Agency (IRA)

    • Russian organization responsible for a state-sponsored social media influence campaign
    • Created a Facebook group 'Being Patriotic' in 2016
  • Key Incident

    • Posted a meme featuring a veteran, urging users to like and share if they believed veterans should receive benefits before refugees
    • Claimed, based on a statement from Donald Trump's campaign, that 620,000 refugees would cross the US/Mexico border while over 50,000 homeless veterans were dying
    • The refugee claim was later refuted by PolitiFact
    • Despite its misleading intent, the meme was shared by more than 640,000 users
  • Objectives of the IRA's Campaign

    • Undermine democratic functioning in the US and Europe
    • Spread misleading, divisive content
    • Techniques included producing and promoting controversial merchandise (e.g., T-shirts, sex toys)
  • Disinformation vs. Misinformation

    • Disinformation: Intentional falsehoods spread for political/economic gain
    • Misinformation: False or misleading content shared without deceitful intent
    • The distinction is crucial for determining what regulation is needed
  • Social Media Impact

    • Social networks amplify information spread via peer sharing
    • Personal trust influences belief adoption (e.g., friends’ shared content is more likely accepted)
    • The blending of disinformation into misinformation complicates regulation
  • Viral Nature of Social Media

    • Before social media, broadcasting content was limited to a few organizations
    • Now, individuals can amplify messages widely, obscuring the origins of misinformation
    • Peer-to-peer sharing plays a significant role in the propagation of ideas
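The amplification effect of peer-to-peer sharing can be sketched as a simple branching-process model (a hypothetical illustration, not from the notes): each sharer exposes a set of peers, a fraction of whom reshare in turn. When the average number of reshares per sharer exceeds 1, reach grows geometrically, i.e., the content "goes viral".

```python
import random

def simulate_spread(seed_sharers, avg_peers, reshare_prob, rounds, rng):
    """Toy branching-process model of peer-to-peer sharing.

    Each sharer exposes `avg_peers` followers; each exposed user
    independently reshares with probability `reshare_prob`.
    Returns the total number of exposures (reach).
    """
    sharers = seed_sharers
    reach = 0
    for _ in range(rounds):
        exposed = sharers * avg_peers
        reach += exposed
        # Each exposed user independently decides whether to reshare.
        sharers = sum(1 for _ in range(exposed) if rng.random() < reshare_prob)
        if sharers == 0:
            break  # the cascade dies out
    return reach

rng = random.Random(42)
# avg_peers * reshare_prob = 1.6 > 1, so reach compounds round over round,
# unlike a broadcast, which ends after the initial exposure.
print(simulate_spread(seed_sharers=10, avg_peers=20, reshare_prob=0.08,
                      rounds=8, rng=rng))
```

All parameter values here are illustrative; the point is only that individual resharing decisions, not a central broadcaster, determine total reach, which also obscures where the content originated.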
  • Analyzing the Veteran Meme

    • Hard to classify strictly as misinformation or disinformation due to its dual nature
    • Designed to mislead but also propagated by users without deceit
    • Social connections and trust complicate simple labels
  • Content Types

    • Not all propaganda is based on false claims; some aim to manipulate opinions without lies (e.g., calls to action)
    • Example: Statements intended to dissuade voters rather than misinform about specific facts
    • True statements can also mislead in context (e.g., claims about jury summons)
  • Campaign Dynamics

    • Effective memes cost little to create relative to the amplification they receive
    • Emotional and divisive content drives higher engagement thanks to algorithms
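The point about engagement-driven amplification can be illustrated with a toy feed-ranking function (the scoring weights are hypothetical, not from any real platform): if reactions signaling strong emotion are weighted more heavily than passive approval, divisive content outranks better-liked neutral content.

```python
def engagement_score(post):
    """Toy feed-ranking score. Weights are illustrative assumptions:
    shares and angry reactions (signals of strong emotion) count for
    more than passive likes."""
    return (1.0 * post["likes"]
            + 3.0 * post["shares"]
            + 5.0 * post["angry_reactions"])

posts = [
    {"id": "neutral-news", "likes": 500, "shares": 40, "angry_reactions": 5},
    {"id": "divisive-meme", "likes": 200, "shares": 150, "angry_reactions": 120},
]
ranked = sorted(posts, key=engagement_score, reverse=True)
# The divisive post outranks the neutral one despite fewer likes.
print([p["id"] for p in ranked])
```

Under these assumed weights the divisive meme scores 1250 against 645 for the neutral post, so a cheap, emotionally charged item wins the feed slot.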
  • Policy Considerations

    • Policies should address the amplification of misinformation as well as pure disinformation
    • Censoring individuals who share misinformation may infringe on free speech
    • Social media platforms need accountability for their algorithms and the impacts on democracy
  • Recommendations for Regulation

    • Develop a framework to evaluate recommendation algorithms for potential exploitation
    • Identify propaganda effectively without amplifying it in individual feeds
    • Social media firms must adapt their methods quickly to counter evolving propaganda strategies
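One way the proposed evaluation framework might look in miniature (a hypothetical audit metric, sketched here for illustration): compare how much exposure flagged propaganda gets under a recommendation algorithm versus a neutral chronological baseline. A ratio above 1 means the ranker amplifies it.

```python
def amplification_ratio(posts, ranker, top_k):
    """Hypothetical audit metric: the share of top-k feed slots given to
    flagged posts, divided by their share under uniform (chronological)
    exposure. Ratio > 1 means the ranker amplifies flagged content."""
    flagged = [p for p in posts if p["flagged"]]
    if not flagged:
        return 0.0
    baseline_share = len(flagged) / len(posts)  # uniform exposure
    top = sorted(posts, key=ranker, reverse=True)[:top_k]
    ranked_share = sum(1 for p in top if p["flagged"]) / top_k
    return ranked_share / baseline_share

# Example: a ranker that rewards shares heavily, with flagged posts
# engineered (as divisive content often is) to attract shares.
posts = [
    {"id": i, "shares": (100 if i % 4 == 0 else 10), "flagged": (i % 4 == 0)}
    for i in range(20)
]
print(amplification_ratio(posts, ranker=lambda p: p["shares"], top_k=5))  # → 4.0
```

Here flagged posts are 25% of the pool but fill 100% of the top slots, an amplification ratio of 4. An audit of this shape can be rerun as platforms change their rankers, supporting the quick adaptation the notes call for.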
  • Concluding Insight

    • The nature of modern propaganda demands a nuanced understanding of misinformation and disinformation
    • Recognizing the relationship between social media dynamics and belief propagation is vital for effective policy