Lecture Notes: Converged Context, Algorithms, and Algorithmic Culture
Announcements and Logistics
The assessment essay outline worksheet has been released under Assessment > Assessment Essay Outline, about a month ahead of the due date. Submissions do not open until September. Review the outline, the accompanying FAQ document, and the rubric before starting.
The timing of assessments this semester is challenging: the exam falls in Week 9, followed by the mid-semester break and a public holiday (Charles), with the outline due the following week, so there is no tutorial immediately before the due date.
Discussion boards are active and should be used for assessment questions. Students were reminded to raise questions about the assessments well in advance, given the compressed schedule.
The Converged Context (Recap and Foundations)
Last week’s arc:
Symbolic media: media that depend on embodied interaction.
Technical media: technologies that begin to replace or augment the human body.
Digital media: media that run on binary code and converged practices.
Converged context refers to the present situation where multiple forms of media, technologies, and platforms merge and interact, creating new social and cultural environments.
Henry Jenkins and convergence:
Convergence involves multiple dimensions: media convergence (forms and platforms), content convergence (remixes and cross-media storytelling), and corporate/technological convergence (ownership and ecosystems).
Converged media can enable participatory culture (people participating in remixing and distribution) but can also create monopolies and technopolies (concentration of ownership and control).
Converged contexts create new social conditions: decentralized and distributed networks, online communities, and transmedia storytelling.
Examples of convergence in practice:
Memes and climate-change messaging spread through converged media.
TikTok and other platforms enable replication of popular dances (cat’s-eye dance example) across contexts.
Key social concept: convergence can empower participation but also consolidate power under a few large corporations.
Lasswell’s model extended to platforms: communication now involves networks of receivers and producers across media platforms, not just a single receiver.
From Converged Context to Platformed Society: Physical and Social Infrastructures
Physical/network infrastructure of digital media:
Binary processing enables content to be viewed across devices; same content can be consumed on mobile, TV, tablet, etc.
Binary code is processed in a discrete, stepwise manner: sequences of 0s and 1s handled in equidistant steps.
Transmission media include electricity, light, radio waves; submarine cables under the ocean carry internet traffic.
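The device-independence point above can be sketched in a few lines: the same binary payload decodes to the same content no matter which "device" reads it (the device names here are invented for illustration).

```python
# Sketch: one binary payload, identical content on every device that decodes it.

payload = "converged media".encode("utf-8")  # content as a discrete byte sequence

print(payload[:4])  # b'conv' — on the wire it is just bits, not "a phone message"

for device in ("phone", "tv", "tablet"):
    # Each device runs the same stepwise decode over the same bit sequence.
    print(device, "renders:", payload.decode("utf-8"))
```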
Paul Baran and distributed networks:
Early military thinking favored decentralized, distributed networks to improve resilience: if one hub fails, others continue to operate.
Centralized networks with a single hub are vulnerable; distributed networks are more robust.
Paul Baran’s distributed network as a design principle for robust communications (sixties context).
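Baran's resilience argument can be illustrated with a toy simulation, assuming a five-node network with invented node names: knock out the hub in a centralized (star) topology and in a distributed (mesh) topology, then check who can still reach whom.

```python
# Toy version of Baran's comparison: star vs mesh after the hub fails.

def reachable(edges, start, removed):
    """Breadth-first search over an undirected edge list, skipping `removed`."""
    frontier, seen = [start], {start}
    while frontier:
        node = frontier.pop()
        for a, b in edges:
            for nxt in ((b,) if a == node else (a,) if b == node else ()):
                if nxt != removed and nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return seen

# Centralized: every outstation talks only through the hub.
star = [("hub", "a"), ("hub", "b"), ("hub", "c"), ("hub", "d")]
# Distributed: the stations also link to each other in a ring of cross-links.
mesh = star + [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]

print(sorted(reachable(star, "a", removed="hub")))  # ['a'] — the star collapses
print(sorted(reachable(mesh, "a", removed="hub")))  # ['a', 'b', 'c', 'd'] — mesh survives
```

The design point is the one from the lecture: the mesh pays for redundancy with extra links, and that redundancy is exactly what makes the failure of any single node survivable.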
Social structures of communication:
Henry Jenkins and the idea of convergence at the social level: content, communities, and corporate ownership converge.
Transmedia story worlds (e.g., Marvel, DC) rely on multiple media forms (comics, films, games, podcasts).
Communities on platforms (Reddit, Facebook, Instagram) converge around shared meanings and content.
The social consequences of convergence:
Democratisation of participation (everyone can participate) alongside concentration of ownership (few companies control ecosystems).
Market-driven content dynamics can shift what is easily accessible and what gets visibility.
Practical example of convergence in action:
Amazon’s ecosystem shows a consolidated media/entertainment/tech/logistics network that minimizes the need to exit the Amazon universe for media consumption.
A single platform can host publishing (Kindle Direct), audiobooks (Audible), film production (MGM Studios), streaming (Prime Video), user-generated content (Twitch), reviews (IMDb), and even hardware devices (Kindle, Fire TV).
Global reach and infrastructure:
Amazon operates in many countries, illustrating globalized platforms’ reach and the ongoing reliance on international logistics and undersea cables.
Submarine cables connect continents, while ships transport physical goods; shipmap.org visualises the global movement of maritime traffic (e.g., 2012 data alongside 2032 projections).
The converged context is not unique to Amazon; other large firms (e.g., Disney) follow similar consolidation patterns across media ecosystems.
Algorithms and Algorithmic Cultures
What is an algorithm?
A set of instructions that processes data according to pre-programmed rules to produce outputs such as classifications, predictions, or recommendations.
Algorithms can be infrastructural components of platforms; they are not limited to digital contexts (they can exist in non-digital forms as well).
How algorithms operate in platforms:
Collect data about user behavior, preferences, and interactions.
Use pre-programmed rules to generate recommendations or categorize content.
Output is displayed as recommended items or new content to engage users.
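The three-step loop above (collect behaviour, apply pre-programmed rules, output recommendations) can be sketched minimally; the viewing log, scoring rule, and catalogue titles below are invented for illustration, not any platform's actual system.

```python
# Minimal sketch of the collect → rule → output loop described above.

from collections import Counter

viewing_log = ["western", "western", "drama", "western", "comedy"]

def recommend(log, catalogue, top_n=2):
    # 1. Collect: tally which genres the user actually engages with.
    tastes = Counter(log)
    # 2. Rule: score each title by how often its genre appears in the log.
    scored = sorted(catalogue, key=lambda t: tastes[t["genre"]], reverse=True)
    # 3. Output: surface the highest-scoring items back to the user.
    return [t["title"] for t in scored[:top_n]]

catalogue = [
    {"title": "Open Range", "genre": "western"},
    {"title": "Step Brothers", "genre": "comedy"},
    {"title": "Manchester by the Sea", "genre": "drama"},
]

print(recommend(viewing_log, catalogue))  # ['Open Range', 'Step Brothers']
```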
Why algorithms matter:
They power infrastructure for platforms like Amazon, Facebook, Google, etc., shaping what people see and engage with.
A single misstep in an algorithm (e.g., mislabeling content) can have outsized cultural effects.
Ted Striphas’s key points (as discussed in class):
Algorithms are infrastructural and central to how we engage with digital media.
They enable a form of “algorithmic culture” where private corporations influence what counts as the “best that has been thought and said.”
Algorithmic culture can be seen as a democratic-sounding phenomenon in which everyone can participate, but in practice it is often controlled by the processes of data gathering and private ownership.
He argues that algorithmic culture can be “decisive” and that companies like Amazon, Google, and Facebook act as the new apostles of culture.
The Amazon example as a case study of algorithmic power:
Amazon’s recommendation system shows products based on related metadata and user behavior (e.g., for Brokeback Mountain, related items include author works, related genres, and user-viewed items).
The 2009 “Amazon Gate” incident: a cataloguing error led to the suppression of gay and lesbian literature in recommendations; this demonstrates how a small error in the algorithmic pipeline can shape what gets seen and not seen, underscoring the power of algorithmic systems.
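The mechanism behind that incident can be sketched abstractly: if downstream ranking only ever sees items that pass an upstream metadata filter, then a single wrong flag makes a title invisible everywhere downstream. The flag name and titles below are invented; this is not Amazon's actual pipeline.

```python
# Hedged sketch of the pipeline effect: one wrong upstream flag, and the
# title never reaches any downstream recommendation or ranking.

catalogue = [
    {"title": "Brokeback Mountain", "category": "fiction", "adult": False},
    {"title": "Generic Thriller",   "category": "fiction", "adult": False},
]

def visible(catalogue):
    # Downstream ranking only ever sees items that pass this filter.
    return [b["title"] for b in catalogue if not b["adult"]]

print(visible(catalogue))  # both titles surface

# One cataloguing error upstream...
catalogue[0]["adult"] = True
print(visible(catalogue))  # ['Generic Thriller'] — the other title vanishes
```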
Cambridge Analytica and algorithmic adjustments:
The scandal led to changes in Facebook’s algorithm, illustrating how corporate algorithms respond to public scrutiny and political events.
Algorithmic imaginaries (Taina Bucher): how people perceive and understand algorithms.
The situations in which people become aware of algorithms, how they experience and interpret them, and how that awareness affects their usage.
Algorithmic imaginaries are not merely false beliefs; they describe the ways people imagine and reason about algorithms and what those imaginaries enable.
Algorithmic imaginaries in practice:
People attempt to outsmart or work with algorithms to go viral (e.g., viral challenges, hashtag campaigns), shaping how content is crafted and shared.
The “how to go viral” discourse reflects users’ attempts to align with algorithmic preferences.
Algorithmic imaginaries and racism online (geopolitical/polarization context):
2020 George Floyd murder sparked global Black Lives Matter protests; online spaces saw competing narratives with some actors promoting White Lives Matter.
Twitter (now X) hashtags raised questions about the ethics of algorithmic amplification and of drowning out content; some communities (e.g., K-pop fandoms) counter-mobilised against racist discourse using algorithmic leverage, flooding the White Lives Matter hashtag with supportive or counter-content.
The outcome is not a “disappearance” of racist content; rather, it moves to other spaces or gets suppressed within one channel, prompting concerns about searchability and visibility across platforms.
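The mechanic the fandoms exploited can be sketched with a toy trending counter, assuming (as a simplification) that trending simply ranks hashtags by post volume: flooding does not remove the tag from trending, but it changes what a visitor to the tag actually sees. All posts below are invented.

```python
# Toy trending list: volume-based ranking cannot distinguish hostile use of a
# hashtag from counter-use of it — only the mix of content under the tag shifts.

from collections import Counter

def trending(posts, top_n=1):
    tags = Counter(tag for post in posts for tag in post["tags"])
    return [t for t, _ in tags.most_common(top_n)]

posts = [{"tags": ["#WhiteLivesMatter"], "kind": "racist"} for _ in range(5)]
# Counter-mobilisation: flood the same tag with unrelated or supportive content.
posts += [{"tags": ["#WhiteLivesMatter"], "kind": "counter"} for _ in range(50)]

print(trending(posts))  # the tag still trends...
share = sum(p["kind"] == "counter" for p in posts) / len(posts)
print(round(share, 2))  # ...but roughly 0.91 of its content is now counter-content
```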
The spiral of silence (Elisabeth Noelle-Neumann):
People self-censor in environments where they perceive their views as unpopular or risky to express publicly.
As individuals suppress views, broader opinions may appear to shift, even if the underlying views persist.
This phenomenon interacts with algorithmic filtering: mass media shape the climate of opinion, which then guides who feels safe to speak and who does not.
Implication for assessment: if audiences are not exposed to diverse viewpoints, their judgments may be biased by media representations rather than direct experience.
Content policing and platform governance (examples):
Shadow banning: platforms like Meta (Facebook/Instagram) have been accused of reducing the visibility of certain content or accounts without overt suspensions.
Hashtag bans on Instagram: historically banned hashtags (e.g., “ass,” “suicide,” “self-harm,” “anorexia”) and a broader set of niche or gendered hashtags (e.g., “date,” “hawks,” “pretty girl”) illustrate how moderation practices influence discourse and visibility.
Shadow banning and moderation illustrate how platforms police content and influence what users see and talk about.
Algorithmic imaginaries and online behaviors:
The same mechanisms can be used for good (drowning out harmful content) or bad (hiding legitimate discussion). When harmful content is suppressed, the source content can migrate to other spaces or be reassembled in less visible ways.
The concept of algorithmic imaginaries helps explain why people adopt particular strategies to navigate or manipulate platform algorithms.
Algorithmic imaginaries and contemporary culture:
The idea that platforms are not neutral, and that algorithms actively shape what culture gets produced and circulated.
The rise of “truth platforms” (e.g., Truth Social) and the diversification of social media spaces reflect attempts to reframe the algorithmic environment, with mixed effects on discourse.
Semiotics and racism (brief example):
The White Lives Matter pamphlet is analyzed semiotically to show how denotation (old-timey aesthetics, chiseled physiques) and connotation (nostalgia, racialized nostalgia, “the changing face of the West”) work to recruit audiences.
The pamphlet uses a circular mirror motif that could symbolically invite the viewer to imagine themselves in the image.
Takeaways on algorithms and culture:
Algorithms are infrastructural: they underlie how we access and engage with culture.
Algorithms are also cultural: they encode and reproduce collective human practices and biases in the process of classifying, predicting, and recommending content.
They are not neutral; they can empower certain voices (e.g., K-pop stans amplifying marginalized content) while suppressing or marginalizing others (e.g., censorship or shadow banning of certain topics).
The interplay of algorithmic systems and human behavior generates a complex, evolving cultural ecosystem where power dynamics revolve around data, infrastructure, and ownership.
The George Floyd Moment, Racism, and Algorithmic Dynamics
The George Floyd murder (2020) and the global Black Lives Matter protests.
White Lives Matter trending on Twitter/X and how algorithmic dynamics affected visibility.
K-pop fandoms and allied responses:
Used algorithmic imaginaries to flood the White Lives Matter hashtags with counter-content, dampening the legibility of racist messages in that space.
Demonstrates how communities can leverage platform mechanics to push alternative narratives, while the racist content does not disappear and often migrates elsewhere.
Spiral of silence in online spaces:
Some individuals opt to silence their own views when those views contradict the dominant discourse in particular contexts (e.g., universities, professional settings).
The broader effect is to shrink the perceived diversity of public opinion, as people rely on mass-media representations to gauge the climate of opinion.
Practical implications:
The persistence of racist content despite algorithmic countermeasures highlights limits of technical moderation and the need for broader social and political interventions.
The event underscores the asymmetric power of platform owners in shaping discourse and the importance of critical media literacy.
Semiotic Analysis and Visual Rhetoric: White Lives Matter Pamphlet
Denotation (what is shown):
Old-timey aesthetic; chiseled jawlines; physically fit, hardy male figures; nostalgic visual register.
The phrase “The changing face of the West” paired with the imagery.
Circular mirror motif in the design.
Connotation (what it implies):
Nostalgia for a bygone era; implied valorization of whiteness and traditional power structures.
The phrase “changing face” signals a perceived threat to the status quo and invites alignment with a white identity project.
The circular mirror invites viewers to imagine themselves as part of the image, enhancing self-identification with the message.
Interpreting the rhetoric:
The imagery and language position a particular political and racial ideology as rightful and natural, leveraging historical aesthetics to appeal to emotion and memory.
Connection to algorithmic discourse:
The pamphlet illustrates how coded racial ideologies can be reproduced and reinforced through media artifacts and cultural consumption, including digitally mediated channels.
Synthesis: Core Messages and Takeaways
Algorithms are infrastructural and cultural:
They are built into the fabric of how we access, organize, and produce culture.
They reflect and reinforce social norms, economic interests, and power relations.
Convergence creates both opportunity and risk:
Opportunities for participation, remix, and cross-platform storytelling.
Risks of monopolies, data exploitation, and manipulation of public discourse.
The social dimensions of algorithms:
Algorithmic imaginaries shape how people understand and navigate platforms.
The spiral of silence can be reinforced by algorithmic filtering and mass-media representations.
The ethics and politics of platform power:
Shadow banning, content moderation, and hashtag policing illustrate platform governance with real-world consequences for discourse and civil society.
Final prompts for the Ted Striphas reading:
What is an algorithm? How do algorithms operate as infrastructural and cultural forces?
In what ways do algorithmic systems enable or hinder democratic public culture?
How do events like the Cambridge Analytica scandal or Amazon Gate illuminate the power of algorithms?
How do algorithmic imaginaries influence user behavior and content production?
Next steps:
Read the Ted Striphas article in full for a deeper historical sense of information, crowd, and algorithm.
Watch the video that was attempted in class to enhance understanding of the example cases.
Prepare the essay outline and review the rubric before starting the assignment.
Core References and Concepts Mentioned
Convergence: Henry Jenkins; transmedia story worlds; content convergence; corporate convergence; monopolies vs technopolies.
Paul Baran: distributed network concepts and resilience; distinction from centralized hubs.
Kittler’s binary code concept: 0s and 1s; equidistant processing; digital networks enable cross-device content delivery.
Lasswell’s model extended to networks: the idea of multiple receivers and interactive communication on platforms.
Amazon case study (and related phenomena): proprietary data, recommendations, and the power of algorithmic curation; the titles affected in the 2009 “Amazon Gate” incident.
Ted Striphas: algorithmic culture; the power of algorithms in shaping public culture; the quote about algorithmic culture and private control.
Algorithmic imaginaries (Taina Bucher): awareness, experience, and potential effects of algorithms on user behavior.
Noelle-Neumann (spelled as Noelle-Newman in the talk): spiral of silence; mass media shaping perception and public opinion.
Shadow banning and hashtag policing (Meta/Instagram): content moderation and its implications for discourse.
White Lives Matter pamphlet: semiotics of denotation and connotation; changing face of the West; visual rhetoric and propaganda.
Thematic connections: the interplay of infrastructure, culture, power, and ethics in algorithmic regimes.