INST201: Introduction to Information Science — Exam 2 Study Guide Spring 2026
Exam Overview
Date: April 3, 2026
Format: In-class, closed book, closed notes
Total Points: 80 points
Coverage:
AI
Privacy & Security
Surveillance
Economics & Labor
Social Media
Online Communities
Assigned Readings/Videos:
DeepMind et al., Ethical and Social Risks of Harm From Language Models
White House (Trump Administration). America's AI Action Plan
Crash Course: Social Media
Carr & Hayes. Social Media: Defining, Developing, and Divining
Zuboff. Surveillance Capitalism (Chapter 1)
Ben Jordan's piece on Flock Safety
Question Types:
Multiple choice
True / False
Short answer
Critical thinking essay
Artificial Intelligence Core Concepts
Definition of AI vs. Machine Learning:
Artificial Intelligence (AI) encompasses systems that simulate human intelligence to perform tasks that typically require human cognition, such as understanding language, recognizing patterns, and making decisions.
Machine Learning (ML) is a subset of AI that focuses on algorithms and statistical models that allow computers to perform specific tasks without explicit instructions, instead relying on patterns and inference.
Is AI actually 'artificial' and 'intelligent'?
This question examines whether AI truly mimics human intelligence and what counts as 'intelligence' in machines in the first place.
What is intelligence? (Melanie Mitchell's observation):
Melanie Mitchell observes that there is no single agreed-upon definition of intelligence; it generally involves the ability to learn from experience, adapt to new situations, understand complex ideas, and engage in reasoning.
Moravec's Paradox:
This paradox states that high-level reasoning requires relatively little computational power, while low-level sensorimotor skills require enormous computational resources. Thus, tasks that are simple for humans are difficult for machines and vice versa.
Crawford's argument — AI as material and embodied:
Kate Crawford argues that AI systems are not just abstract algorithms; they are material and embodied systems that have a real-world impact based on their design and implementation.
Opportunities and Risks
OECD Potential Benefits of AI (general categories):
AI offers potential benefits in enhancing productivity, improving healthcare, and fostering innovation.
Three Categories of Risk:
Malicious Use: Potential for deliberate harm, such as using AI for cyberattacks or misinformation.
Malfunctions: Errors in AI systems that could lead to unintended consequences.
Systemic Risks: Broader societal impacts of widespread AI implementation, such as job displacement and inequality.
Real-world example: DeepMind and the NHS data case:
DeepMind received identifiable records of roughly 1.6 million patients from the Royal Free NHS Trust; the UK Information Commissioner's Office later found the sharing lacked a proper legal basis. The case highlights concerns about consent, data governance, and AI-driven decision-making in healthcare.
AI and Society
AI as a new kind of information network:
AI is positioned as a transformative network influencing how information is processed and shared.
Historical comparison:
Comparison to historical information networks such as the printing press, radio/TV, and social media, illustrating the evolving dynamics of information control.
Who tries to control new information networks, and why:
Entities ranging from governments to corporations seek control to influence public discourse, protect interests, or maintain power.
Information countermovements and decentralization:
Movements advocating for decentralized information systems to counterbalance the concentration of power within traditional networks.
AI as an amplifier of agency:
AI has the potential to empower users by enhancing their capabilities, but it also risks reinforcing existing power structures.
Are we moving toward an Algorithmic or Attention-Dominated Society?
This discussion asks whether society is increasingly governed by algorithms that direct attention, and whether individuals can reclaim agency over what they see and do.
Zuboff and Crawford's arguments about the data-extractive society:
Zuboff describes contemporary capitalism as being predicated on extracting personal data for profit, impacting civil liberties.
Wu and Citton's arguments about the attention economy:
They argue the economy is increasingly structured around capturing and monetizing user attention through digital platforms.
AI in Policy
Biden vs. Trump administration approaches to AI:
The Biden administration emphasized safety testing, ethical considerations, and regulatory guardrails, while the Trump administration's America's AI Action Plan prioritizes deregulation, innovation, and U.S. competitiveness.
California SB 53:
California legislation governing frontier AI systems, requiring large AI developers to disclose their safety practices.
AI copyright lawsuits — general landscape:
Overview of ongoing legal debates about AI-generated content and intellectual property rights.
Character.AI and mental health concerns:
Examination of AI companions and their implications for human mental health and emotional well-being.
Privacy & Security
Definition of Privacy:
Privacy is the right of individuals to maintain control over their personal information and to be free from unauthorized intrusion.
Three components of privacy:
Personally Identifiable Information (PII): Information that can be used to identify an individual.
Physical Access: The right to control who has access to one's physical space.
Freedom from Undue Influence: The ability to make decisions without coercion or manipulation.
Four reasons privacy matters:
Protection from Misuse of PII: Preventing exploitation of personal information by malicious actors.
Relationships: Privacy fosters trust in interpersonal relationships.
Autonomy: Privacy underpins individual autonomy and self-determination.
Human Dignity and Power: Essential for maintaining dignity and power over one’s life.
The Nothing-to-Hide Argument — and why it fails:
We all have secrets: Everyone has lawful information they prefer to keep private; privacy is intrinsic to human dignity, not only a shield for wrongdoers.
Disclosure and the aggregation problem: Sharing incremental data can lead to comprehensive profiling.
No-fault attacks: Even people who have done nothing wrong can be harmed by data exposure (e.g., breaches, identity theft), so privacy is not only about guilt or innocence.
Cyber Security
Definition of a security problem (vs. a simple malfunction):
A security problem arises when there is potential for unauthorized access or damage, whereas a malfunction is simply a failure of the system’s performance.
CIA Triad — all three components:
Confidentiality: Protection of information from unauthorized access.
Integrity: Assurance that the information is reliable and untampered.
Availability: Ensuring that authorized users have access to data and resources when needed.
Software vulnerability, exploit, and malware — distinctions:
A vulnerability is a weakness that can be exploited, an exploit leverages a vulnerability to compromise a system, and malware refers to malicious software designed to harm or exploit.
Types of attacks:
Virus: A self-replicating program that attaches to files.
Worm: A standalone malware that replicates itself to spread to other systems.
Watering Hole: A strategy where the attacker compromises a site likely to be visited by the target.
Social Engineering / Spear-phishing: Manipulative techniques used to exploit human vulnerabilities.
Vulnerability disclosure and Bug Bounties:
Programs that incentivize individuals to report vulnerabilities instead of exploiting them maliciously.
Privacy and Security convergence — why they are merging:
The increasing overlap between privacy concerns and security measures as organizations aim to protect user data while ensuring secure systems.
Privacy vs. National Security tension:
An ongoing debate on the balance between safeguarding individual privacy rights and ensuring national security.
NSA history and domestic surveillance:
Overview of the National Security Agency’s role and history in monitoring communications for security purposes.
Section 702 of FISA:
A legal authority permitting warrantless surveillance of non-U.S. persons located abroad; because Americans' communications are incidentally collected in the process, it affects citizens' privacy rights as well.
Datafication & Surveillance
Definition of surveillance:
Surveillance is the monitoring of behaviors and activities by an individual or group, typically in a systematic way.
Four characteristics of surveillance:
Unequal information gathering: Disparities in what information is collected from different groups.
Establishing hierarchies and power: Surveillance reinforces social hierarchies through information asymmetries.
Enacting control after the fact: Ability to monitor actions after they have occurred, influencing future behavior.
Inducing self-discipline (Hawthorne Effect): Individuals may change their behavior when they know they are being watched.
Surveillance Capitalism — Shoshana Zuboff:
Definition: A term coined by Zuboff describing the new economic system where personal data is commodified and used for profit.
Traditional capitalism vs. surveillance capitalism: The former commodifies labor and material goods, while the latter commodifies human experience itself, extracted as behavioral data.
Behavioral surplus: Behavioral data collected beyond what is needed to provide or improve the service, repurposed to predict and influence future behavior.
Real-world examples:
Google/Alphabet: Data-driven business model relying heavily on user data for targeted advertising.
Meta: Corporate practices regarding data collection from social media usage.
Flock Safety: Company deploying surveillance technology for security purposes, illustrating ethical concerns.
Data brokers — definition, incentives, and practices:
Companies that buy and sell personal data for various purposes, often lacking transparency.
Ways to address surveillance:
Solutions include legislative action, increased public awareness, and competition in technology markets.
Economics & Labor
Types of Economy:
Information Economy: Economy primarily focused on creating, distributing, and using information.
Platform Economy: Relies on online platforms that connect service providers with consumers.
Sharing Economy: Emphasizes collaborative consumption and sharing of resources.
Creator Economy: Focused on independent content creators generating income through platforms.
Attention Economy: Centers around monetizing attention via advertisements and engagement.
Gig Economy: Economic model involving short-term, flexible jobs, often through digital platforms.
The Advertising Model
Freemium model:
A business model in which basic services are provided for free while premium features require payment.
The four-step advertising model:
Free service → Data → Targeting → Attention: Users get free service; data is collected and used for targeted advertising which seeks to capture user attention.
Ethan Zuckerman — The Internet's Original Sin:
Argues that advertising, and the surveillance it requires, became the web's default business model, leading platforms to prioritize engagement over user well-being.
Users as the product: Their attention is monetized, making them less customers and more products in this economy.
Online Advertising by the Numbers
Statistics highlighting the scale and economic heft of online advertising, emphasizing its impact on user privacy and the structure of the digital economy.
Gig Workers
Definition of the gig economy:
A labor market characterized by short-term contracts and freelance work instead of permanent jobs.
Why gig workers are NOT employees:
Gig workers are typically classified as independent contractors, which excludes them from employment protections such as minimum wage, benefits, and collective bargaining.
Role of the platform in gig labor:
Platforms serve as intermediaries that facilitate gig work, exerting control over working conditions.
Algorithmic management:
The use of algorithms to manage and coordinate gig workers, impacting their labor experiences and conditions.
Uber as a case study:
Focus on how Uber established its operations through a growth-over-profit strategy, introduced dynamic surge pricing, and faces legal issues regarding worker classification.
Uber BV v Aslam (2021) — UK ruling: The UK Supreme Court held that Uber drivers are 'workers' entitled to minimum wage and paid leave, a landmark decision on gig-worker status and rights.
California AB5 and Proposition 22: AB5 (2019) codified a strict test for classifying workers as employees; Proposition 22 (2020) then exempted app-based drivers, illustrating the ongoing legal battle over defining gig economy work.
Key takeaway: Economic risks are often transferred from the firm onto individual workers in the gig economy.
Content Moderators
Definition of content moderation:
The practice of monitoring user-generated content to enforce guidelines and regulations on digital platforms.
The moderation challenge:
Balancing the need for oversight with the risk of overreach or allowing harmful content.
Four characteristics of moderation labor:
Emotional and psychological toll, the subjectivity of judgment calls, the invisibility of the work, and relentless operational demands.
DSA Transparency Database:
An EU database, created under the Digital Services Act, where platforms report their content-moderation decisions, revealing the scale of moderation practices and the labor behind them.
Content moderation as unseen labor:
The essential but often overlooked work that supports social media environments, emphasizing the stressors faced by moderators.
Influencers
Definition of an influencer:
Individuals who leverage their online presence to affect the purchasing decisions and perceptions of their followers.
How influencers monetize:
Through advertisements, affiliate marketing, platform revenue, merchandise, and subscriptions.
Aspirational Labor — Brooke Erin Duffy:
Concepts surrounding the motivations, aspirations, and labor demands placed on influencers.
Influencer income reality:
Variability in earnings, often highlighting vulnerabilities within this labor model.
The algorithm as the boss:
Algorithms determine which influencers are seen and promoted, reinforcing power dynamics in the industry.
Secondary markets created by influencers:
New economies generated by influencer content and services.
Comparison to gig workers:
Shared vulnerabilities regarding labor protections and reliance on platforms, creating a parallel between influencer work and traditional gig roles.
Social Media Definitions and Characteristics
Multiple definitions:
Definitions vary among scholars, noting nuances and the multifaceted nature of social media.
Carr & Hayes (2015) definition:
The most comprehensive definition used in class, capturing the essential characteristics of social media.
Five characteristics of social media (Carr & Hayes):
Internet-based: Rooted in the digital realm.
Persistent channels: Communications that endure beyond the initial engagement.
Perceived interactivity: Users engage dynamically and interactively.
User-generated value: Content created and valued by users themselves.
Mass-personal communication: Blending mass media dissemination with personal engagement.
History and Timeline
1960s–1980s: Development of Email, Bulletin Board Systems (BBS), and Usenet, establishing early forms of digital communication.
1990s: Emergence of web services, including GeoCities and Classmates.com, allowing for personal page creation and early Social Networking Sites (SNS).
2000–2005: Growth of Web 2.0 technologies, including blogs, wikis, and the beginnings of social networks.
2006 onward: Proliferation of platforms like Twitter and Instagram and the solidifying of the platform economy.
Social Media and Society
Social media curation and its commercial incentives:
The economic drivers behind selective content management on social platforms.
Is social media an online community? (apply Baym's five qualities):
An analysis of whether social media platforms meet criteria defining a community.
Social media addiction:
Exploration of the phenomenon termed social media addiction, including definitions and statistics.
Mental health and social media:
Examination of the relationship between social media use and mental well-being, informed by empirical research.
Zuckerberg's claim and why it is misleading:
Critical evaluation of statements made by social media executives and their implications.
Meta on trial:
Discussions around company practices regarding responsibility for harmful design choices and potential liability.
There is no unbiased social media:
Acknowledgment of the inherent biases present in social media algorithms and practices.
Online Communities
What Makes an Online Community
Baym's five qualities of online communities:
Space: A place for engagement and interaction.
Shared Practice: Common behaviors or rituals among users.
Shared Resources & Support: Availability of content or aid among community members.
Shared Identity: A common sense of belonging.
Interpersonal Relationships: Connections that form between community members.
The difference between a forum and a true community:
Distinguishing a simple online forum from a community based on user engagement and connection.
Ray Oldenburg's Third Place concept:
The idea of a ‘Third Place’ as a social environment outside of home and work that fosters community interaction.
Types of Online Communities
Place-based online communities:
Examples include platforms like NextDoor or local neighborhood groups connecting users by geography.
Interest-based, identity-based, and practice-based communities:
Communities emerging around shared interests, identities, and collaborative practices.
Online Identity
Personal identity vs. social identity:
Personal identity (how you see yourself) vs. social identity (the groups you belong to and how others see you), and how online self-presentation may diverge from offline identity.
Disembodied identities online:
The phenomenon where individuals present themselves differently in online spheres.
Imagined audiences:
Understanding how users perceive and create content with a specific potential audience in mind.
Self-branding:
The active process of creating a public persona for oneself through digital means.
Goffman's concept of multiple social roles — applied online:
Analysis of how individuals adapt their self-presentation to different social contexts, drawn from Erving Goffman's theoretical frameworks.
Study Tips
Review all lecture slides:
Ensure comprehension of key concepts and topics.
Review all reading materials and videos:
Familiarize with perspectives and arguments presented across media.
Understand concepts and think about real-world examples:
Relate concepts explicitly to current events or personal experiences.
Be able to explain ideas in your own words:
Explaining ideas in your own words reinforces understanding and aids recall during the exam.
Practice distinguishing between similar terms:
Gain clarity on nuanced differences that could be tested.
Focus on understanding the 'why' behind each concept:
Comprehending underlying theories aids in critical thinking.
Remember
Budget your time during the exam:
Strategically allocate time for each question section.
Read questions carefully:
Careful reading helps you identify exactly what is being asked.
Answer what is asked:
Directly and succinctly address each part of the question posed.
Use specific concepts and terminology from class:
Employing accurate terminology demonstrates command of the material.
For essays, address all parts of the question:
Thorough responses require engagement with every element of the prompt.
Leave time to review your work:
Final checks can help catch mistakes or enhance argument clarity.