Case study: Robert Moses’s overpass
Winner examines Robert Moses’s overpasses, parkway system, and parks in the New York metropolitan area, built in the mid-20th century
Moses designed unusually low overpasses that prevented buses from using the Long Island parkways
This made it difficult for bus riders, who were disproportionately non-white, to access public parks like Jones Beach; whether Moses intended this exclusion is debated
HCE tools:
identity/positionality - non-white bus users were disproportionately affected because of their race and economic status
classification - design choice created a classification of who could or could not access the beaches on that side of Long Island
power - Robert Moses exercised immense power by building infrastructure that altered the behavior of the public
sociotechnical system - actions of bus users (social) intertwined with the overpass (technology)
Reading: “Do Artifacts Have Politics?” by Langdon Winner
Core argument: Technology is never neutral; it's political. The things we build—like roads, machines, or software—are set up to favor certain people and groups, giving them power or taking it away from others.
Design choices can exclude people (The Moses Effect). An artifact's specific design features can be used to control society or create inequality on purpose.
Simple Example: Robert Moses built low overpasses on his parkways so buses couldn't pass. This kept poor people and minorities, who relied on buses, away from that side of Long Island.
Technology can be used to fight workers: Companies sometimes bring in new machines not for better efficiency, but to destroy a union or get rid of skilled workers they don't like.
Some technologies require authoritarian control to keep them running safely
Simple Example: The atom bomb must be managed by a rigid, centralized command structure; there's no other safe way.
HCE tools (atom bomb):
power - atom bomb is the ultimate expression of power
sociotechnical system - atom bomb (technology) and rigid, centralized command (social structure) are inseparable
agency - the rigid, centralized command structure is explicitly designed to remove agency from everyone in the system except the people at the top
narrative - the surrounding narrative holds that a rigid, centralized command structure is the only safe way to contain this technology
Case study: Rental market (RealPage)
RealPage pools private lease data and competitor data, which its algorithms use to automate and optimize landlords’ rent-pricing decisions
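A minimal sketch of what algorithmic rent-setting looks like in the abstract; RealPage’s actual model is proprietary, so the demand curve, prices, and function names below are invented:

```python
# Hypothetical sketch of algorithmic rent-setting (not RealPage's actual
# model): estimate demand from pooled competitor data, then recommend
# the revenue-maximizing price.
comps = [2100, 2200, 2250, 2300, 2400]  # invented competitor rents ($/month)

def occupancy(price: float) -> float:
    """Toy demand curve: share of comparable units priced at or above ours."""
    return sum(c >= price for c in comps) / len(comps)

candidates = range(2000, 2600, 50)
best = max(candidates, key=lambda p: p * occupancy(p))
print(f"recommended rent: ${best}, expected occupancy: {occupancy(best):.0%}")
```

Note how the recommendation is driven entirely by the pooled competitor data; that pooling is what lets one tool coordinate prices across many landlords.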
HCE tools:
power - RealPage’s algorithms concentrate pricing power in the hands of property owners, altering the behavior of renters
sociotechnical system - the actions of people (landlords and renters) and technologies (pricing algorithms) are intertwined
performativity - the algorithm doesn’t just describe the optimal market price, it also performs/creates it
representation - algorithms rely on data as representations of market reality
Case study: Bay Area Air Quality Management District (BAAQMD)
Air pollution is abstracted into the Air Quality Index (AQI), created by BAAQMD scientists using official monitoring stations and complex models (the binning step is sketched below).
The data is produced to demonstrate compliance with federal regulatory standards and to provide public health warnings.
The official "View from Above" (AQI) is often challenged by the "View from Below" (EJ groups/residents) who use localized data to highlight missed neighborhood pollution.
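A minimal sketch of the classification step, using the standard U.S. EPA AQI category breakpoints (the modeling pipeline that produces the AQI number itself is far more complex):

```python
# Classifying a continuous AQI value into governable bins.
# Breakpoints follow the standard U.S. EPA AQI categories.
AQI_BINS = [
    (50, "Good"),
    (100, "Moderate"),
    (150, "Unhealthy for Sensitive Groups"),
    (200, "Unhealthy"),
    (300, "Very Unhealthy"),
    (500, "Hazardous"),
]

def classify_aqi(aqi: float) -> str:
    """Collapse a messy pollution measurement into one legible label."""
    for upper, label in AQI_BINS:
        if aqi <= upper:
            return label
    return "Beyond AQI scale"

print(classify_aqi(42))   # Good
print(classify_aqi(157))  # Unhealthy
```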
HCE tools:
classification - air pollution is categorized into governable bins (ex. Good, Moderate, Unhealthy)
representation - AQI score is technological representation of air quality
power - BAAQMD “defines the truth” of the air quality which alters the behaviors of populations and systems (ex. schools)
narrative - 2 conflicting narratives of view from above vs. view from below
Reading: “The Case of Race Classification and Reclassification Under Apartheid” by G.C. Bowker and S.L. Star
Core Idea: Bureaucratic tools, like classification systems and databases, are powerful weapons of injustice, even when they seem “technical.”
The Goal: The apartheid government tried to fit everyone into rigid, unambiguous race boxes (like a fixed color on a chart) to legally enforce segregation.
The Reality: Because race isn't simple, officials had to use unscientific tests (like the pencil test for hair) and personal judgment to decide a person's fate.
Life as Surveillance: The passbook was a 95-page mobile record that tracked every detail of a Black person's life (jobs, taxes, etc.) in a centralized database. Losing it or having a technical error meant immediate arrest and ruin.
The Result: When the government's rigid classification (the box) violently clashed with a person's actual life story, it caused immense emotional and social damage.
Simple Example: Sandra Laing, born to white parents, was suddenly reclassified as "colored" based on her appearance, getting expelled from school and alienated from her family.
HCE tools:
classification - the gov’s goal was to fit everyone into rigid, unambiguous race boxes to legally enforce segregation
power - the entire system was an exercise of asymmetric power; the state used its power to alter the behavior of Black people
sociotechnical system - the social goal (segregation and white supremacy) was enforced through technical tools (passbooks, centralized databases); the passbook (technology) was inseparable from the everyday actions of people (travel, work, taxes)
identity/positionality - the gov classification dictated how society saw and treated individuals
Reading: Public Data Center
The data primarily provides the "View from Above" by reporting the standardized AQI (Air Quality Index) and showing which industrial facilities are in regulatory compliance.
Users can denaturalize the pollution problem by using the data to connect the abstract air quality number (AQI) to specific, regulated polluters in their local area.
Advocates can leverage the maps to find areas that are disproportionately impacted and use that information to challenge BAAQMD policy or push for local changes.
HCE tools:
classification - built on the AQI classification system and sorts facilities as either in or out of regulatory compliance
power - the portal reports the official “View from Above,” reinforcing BAAQMD’s authority to define air quality
Case study: Coordinated Entry System and VI-SPDAT scores
The VI-SPDAT survey produces a score used to prioritize homeless individuals for housing resources (a toy version of the triage logic is sketched below)
Caused representational harm: the score was racially biased and unreliable, systematically assigning people of color lower priority
Resulting score was granted high authority, allowing it to easily circulate and automate life-changing decisions about housing access despite its flaws.
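A toy sketch of the triage logic, assuming the commonly cited single-adult VI-SPDAT cutoffs (actual thresholds and interventions vary by community):

```python
# Hypothetical VI-SPDAT-style triage; cutoffs are the commonly cited
# single-adult thresholds and vary by community.
def recommendation(score: int) -> str:
    if score >= 8:
        return "assess for permanent supportive housing"
    if score >= 4:
        return "assess for rapid re-housing"
    return "no housing intervention"

# Prioritization: the waitlist is sorted by score, highest first, so the
# number alone automates who gets scarce housing resources.
waitlist = [("person A", 9), ("person B", 5), ("person C", 3)]
for name, score in sorted(waitlist, key=lambda x: x[1], reverse=True):
    print(name, score, "->", recommendation(score))
```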
HCE tools:
representation - the VI-SPDAT score is a quantitative representation of an individual’s actual need for housing; it causes representational harm because the score was racially biased and unreliable
power - system concentrates power in the hands of the algorithmic tool and the admin who implements it
identity/positionality - the tool used race to determine how society sees and treats individuals (whether or not they can get housing)
sociotechnical system - technical part (algorithm) is intertwined with the social actions (resource allocation)
Case study: Direct-to-consumer (DTC) DNA testing
Sharing your DNA also exposes genetic information about family members who never consented.
Genetic data can be sold or repurposed beyond its original intent, including use by drug companies or law enforcement.
Because DNA is permanent, you lose control over its future use, which can lead to risks like discrimination or denial of services.
HCE tools:
vulnerability - DNA is permanent so consumer and their relatives are exposed to discrimination, denial of services, or identification by law enforcement
agency - loss of agency bc once data is submitted the individual loses control over its future use
identity/positionality - data from DNA test can be used to assign social positionality (ex. person with specific health risk, potential crime suspect) that impacts how society sees and treats them
representation - data is converted into representations (ex. ancestry maps, health markers) made to stand in for a complex biological reality
Reading: “Coded Exposure” by Ruha Benjamin
Core Idea: New technologies and algorithms often worsen old racial inequalities. Racism isn't just about human bias; it's coded into the systems we use.
Default is Whiteness: Technology is often built around a "White ideal". This leads to two opposite, but equally harmful, problems for non-White people:
Invisibility: Sensors and cameras fail to recognize dark skin.
Hypervisibility: Being over-exposed (ex. facial recognition databases filled disproportionately with Black people).
Systems like facial recognition or predictive policing make certain groups more vulnerable to harm.
Simple Example: Facial recognition software is less accurate at recognizing Black faces, yet the databases it scans are disproportionately filled with Black people, leading to false accusations in policing (a per-group error-rate audit is sketched below).
Bias comes from the very start of design, in the training data or in the engineering choices about which facial features the algorithm should focus on. This is not an accident; it's a choice.
For already exposed groups, being "included" in a flawed system (like making facial recognition more accurate) can just mean better surveillance and control, not true freedom or equity.
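A toy sketch of the kind of per-group error-rate audit that surfaces the disparity Benjamin describes; the records are invented:

```python
# Toy audit of a face-matching system: compare error rates by group.
# Each (group, true_match, predicted_match) triple is invented data.
records = [
    ("A", 1, 1), ("A", 0, 0), ("A", 0, 0), ("A", 1, 1),
    ("B", 1, 0), ("B", 0, 1), ("B", 0, 1), ("B", 1, 1),
]

for group in ("A", "B"):
    rows = [(t, p) for g, t, p in records if g == group]
    errors = sum(t != p for t, p in rows)
    print(f"group {group}: error rate = {errors / len(rows):.0%}")
# group A: 0%, group B: 75% -- identical code paths do not guarantee
# identical accuracy across groups.
```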
HCE tools:
identity/positionality - technology is designed to favor white people, determining how society sees and treats non-white people through technology itself
representation - system’s representation of non-white faces is flawed since it’s less accurate at recognizing Black faces (ex. causing false accusations in policing)
power - institutions that use these technologies (like policing) have asymmetric power to alter the behavior of certain groups
sociotechnical system - actions of people (designers who embedded bias) and technologies (facial recognition, predictive policing) are intertwined
Case study: John Graunt’s tables
Political Arithmetic in Action: Graunt was the first to analyze the weekly London death records (the Bills of Mortality), moving beyond simple guesswork by abstracting messy, anecdotal records into quantifiable, tabular data for state management.
Discovery of Social Regularities: He revealed stable, predictable patterns in human events, such as a consistent ratio of male-to-female births (more boys), regular rates of chronic diseases, and the quantification of high infant mortality. This demonstrated that social phenomena could be governed by statistical laws.
Key Abstraction: Graunt's most significant innovation was creating the first Life Table (Mortality Table), which estimated the probability of survival to different ages. This table provided a core statistical tool for the state to manage population and assess life expectancy.
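A sketch of the life-table abstraction, using the survivor counts usually attributed to Graunt’s 1662 table (treat the exact numbers as illustrative):

```python
# Survivors per 100 births at each age, per the figures usually
# attributed to Graunt's 1662 life table.
survivors = {0: 100, 6: 64, 16: 40, 26: 25, 36: 16, 46: 10, 56: 6, 66: 3, 76: 1}

ages = sorted(survivors)
for a, b in zip(ages, ages[1:]):
    # Conditional survival: P(alive at age b | alive at age a)
    print(f"P(reach {b} | alive at {a}) = {survivors[b] / survivors[a]:.2f}")
```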
HCE tools:
classification - Graunt sorted and categorized records into phenomena (causes of death, gender in births, age of death, etc.)
representation - the tools Graunt produced (e.g. the Mortality Table) are technological representations made to stand in for the chaotic reality of life and death in London
power - Graunt’s statistics provided the state a core tool to manage the population
narrative - Graunt’s work established a scientific narrative that explained how the world is through predictable patterns
Case study: U.S. Census (1790)
Goal of the State: The census was a Constitutional mandate (Political Arithmetic) designed to apportion political power (House seats) and taxes across the states.
Categories of Power: Datafied the population using racial and status categories, e.g. counting enslaved people as 3/5 of a person.
Defining the Norm: The categories, such as "Free White Males," established a legal and statistical norm for full citizenship and agency in the new nation.
HCE tools:
classification - defined and sorted the population using strict racial and status categories (ex. Free White Males, Slaves, etc.)
power - the census was political arithmetic used to divide political power (House seats) and taxes; it also exercised asymmetric power by classifying enslaved people as 3/5 of a person
identity/positionality - the census established the category “Free White Males,” defining the ideal identity and full citizenship (an overall legal hierarchy)
representation - census data was the official representation of the country’s population
Reading: “Nature and Space” by James C. Scott
Core Idea: Governments can only manage things (like land or people) by taking a complex world and stripping it down to a few abstract, measurable facts that are legible.
The "Official" World is Fake: The actual reality (like a diverse forest or flexible local customs) is ignored in favor of an abstract simplification that fits a grid.
Example of Simplification: Replacing complex, ecologically rich forests with simple monocultures (a single tree species) to easily calculate timber revenue.
The View from Below is Richer: Local people's knowledge is highly detailed, flexible, and practical (e.g., measuring distance in "rice-cookings"), but because it doesn't fit the state's rigid formulas, it's considered "illegible" and ignored.
HCE tools:
classification - governments manage the complex world by stripping it down to a few abstract, measurable facts that fit a rigid structure; replacing diverse forests with monocultures is a classification designed for easy management
power - the state exercises asymmetric power by imposing its simplified view of the world onto complex reality, ignoring what it can’t measure (ex. a forest’s ecology, people’s local customs)
representation - gov’s simplified facts are the official representation of the world
sociotechnical system - technology (abstract models, grids) is intertwined with the social action (state’s goal of management and control)
Case study: Francis Galton
Galton applied the Normal Curve to human traits, creating a statistical abstraction of “genetic worth” that he correlated with social class.
Pioneered the concept of regression (to the mean) to measure how the extreme traits of parents tended to become less extreme in their children (simulated below).
His work established a statistical basis for studying human inheritance, but his core goal was using data to judge and categorize human populations.
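A simulated sketch of regression to the mean; the population parameters are invented, not Galton’s measurements:

```python
# Simulated regression to the mean: child trait = population mean
# + r * (parent deviation) + noise. Parameters are invented.
import random

random.seed(0)
MEAN, R = 68.0, 0.5  # assumed mean height (inches) and parent-child correlation
parents = [random.gauss(MEAN, 2.5) for _ in range(10_000)]
children = [MEAN + R * (p - MEAN) + random.gauss(0, 2.0) for p in parents]

tall = [(p, c) for p, c in zip(parents, children) if p > MEAN + 4]
avg_p = sum(p for p, _ in tall) / len(tall) - MEAN
avg_c = sum(c for _, c in tall) / len(tall) - MEAN
print(f"very tall parents: {avg_p:+.1f} in; their children: {avg_c:+.1f} in")
# Children of extreme parents land, on average, closer to the mean.
```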
HCE tools:
classification - Galton judged and categorized human populations to create a statistical hierarchy of genetic worth that was correlated with social class
representation - the Normal Curve was a powerful mathematical representation of human heredity and traits made to stand in for an individual’s actual value or potential
power - Galton established asymmetric power for scientists and the state to define human “worth” and structure policies like in eugenics
performativity - Galton’s work didn’t just describe an existing link between statistics and social class but actively created the concept of “genetic worth” as a statistic
Case study: Linear regression and eugenics
Key statistical methods, like linear regression and correlation, were developed by Francis Galton specifically to quantify and predict the heritability of human traits (like intelligence or talent) across generations.
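A from-scratch sketch of the two quantities this program needed, the correlation coefficient and the regression slope, on toy numbers (hypothetical parent/child trait scores, not historical data):

```python
# Pearson correlation and OLS regression slope computed from scratch.
xs = [64.0, 66.0, 68.0, 70.0, 72.0]  # parent trait (invented)
ys = [66.0, 66.5, 68.0, 69.0, 69.5]  # child trait (invented)

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
vx = sum((x - mx) ** 2 for x in xs) / n
vy = sum((y - my) ** 2 for y in ys) / n

r = cov / (vx * vy) ** 0.5  # correlation: the claimed strength of heritability
slope = cov / vx            # regression slope: predicted trait transmission
print(f"r = {r:.2f}; child = {slope:.2f} * parent + {my - slope * mx:.1f}")
```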
HCE tools:
classification - these statistical methods were developed to create an explicit classification of human populations
power - these statistics provided the state with asymmetric power to manage the makeup of human populations because it turned abstract concepts like “talent” into measurable facts
representation - linear regression and correlation are technological representations made to stand in for complex biological reality
narratives - the statistical methods were used to advance a narrative in which a genetically managed population is the desirable future
Reading: “How Eugenics Shaped Statistics” by Aubrey Clayton
Core Idea: The foundation of modern statistics is not objective. The field's core mathematical concepts were intentionally invented to support eugenics.
Figures like Francis Galton created these mathematical tools specifically to prove the superiority of certain races and classes and justify selective breeding.
Simple Example: Galton created the tools for regression to track how "superior" traits were inherited, to find the best way to "breed" people to improve the human population.
HCE tools:
co-production - eugenics drove creation of tools and tools provided “scientific” legitimacy for eugenics
power - statistics gave power to the proponents of eugenics to structure and alter the behavior of entire populations
classification - statistical tools created a classification system to sort which humans were superior
representation - statistics were technological representations made to stand in for an individual’s genetic worth
Case study: Gapminder.org
An institution that brings together and displays global data on population health, welfare, and size.
Popularized the use of animated bubble charts to make complex global statistics (from the UN/World Bank) intuitive and easy for the public to understand.
The visualization carries a narrative of countries moving toward America’s and Europe’s levels of child mortality and fertility (a toy bubble chart is sketched below)
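A toy version of such a chart with made-up numbers (Gapminder’s real charts animate UN/World Bank data over time):

```python
# Gapminder-style bubble chart on invented data: x = fertility,
# y = child mortality, bubble size = population.
import matplotlib.pyplot as plt

fertility = [1.7, 2.4, 4.8, 6.1]   # births per woman (invented)
mortality = [5, 20, 70, 120]       # deaths per 1,000 live births (invented)
population = [80, 210, 45, 30]     # millions (invented)

plt.scatter(fertility, mortality, s=[p * 5 for p in population], alpha=0.5)
plt.xlabel("Fertility (births per woman)")
plt.ylabel("Child mortality (per 1,000 live births)")
plt.title("Gapminder-style bubble chart (toy data)")
plt.show()
# Animating frames like this over years is what produces the visual
# narrative of countries "converging" toward the wealthy world.
```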
HCE tools:
representation - Gapminder takes complex global statistics and abstracts them into animated bubble charts that are more intuitive
narratives - Gapminder uses visualization to communicate the narrative that countries are moving towards America’s and Europe’s levels of child mortality and fertility
Reading: “Why are Certain Countries Poor? Dismantling Comparative Models of Development” by Andrew Brooks
Core Idea: The idea that poor nations are "behind" and must follow the upward path of the West is a fantasy.
Metrics are Distorted: Global metrics like the Human Development Index (HDI) can be manipulated or skewed (the formula sketched below shows how much a single number must abstract away).
Poverty is relational: Poor countries aren’t poor just because of their own problems. They are poor because their lack of development is tied to the wealth of rich countries, which benefit from global trade and keep them in a weaker economic position.
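A sketch of the HDI computation, following the post-2010 UNDP formula as usually stated (constants approximate; the official formula also caps each index at 1):

```python
# HDI sketch: three dimensions collapsed into one rankable number.
from math import log

def hdi(life_exp: float, mean_school: float, expected_school: float,
        gni_per_capita: float) -> float:
    health = (life_exp - 20) / (85 - 20)
    education = (mean_school / 15 + expected_school / 18) / 2
    income = (log(gni_per_capita) - log(100)) / (log(75_000) - log(100))
    return (health * education * income) ** (1 / 3)  # geometric mean

# Hypothetical country: one number now stands in for a whole society.
print(round(hdi(72.0, 8.5, 12.0, 15_000), 3))  # ~0.72
```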
HCE tools:
narratives - the reading dismantles the narrative that poor nations are “behind” and should follow the West, calling it a fantasy
power - asymmetric power of wealthy nations to maintain the weaker economic position of other populations through global trade
representation - global metrics like the Human Development Index (HDI) are flawed representations
Case study: COMPAS
An algorithmic risk-assessment tool used in the courtroom
Predicts a defendant’s likelihood of recidivism
Proprietary, developed by the for-profit company Northpointe
Ranks defendants as “low,” “medium,” or “high” risk
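A sketch of that final binning step, using the low/medium/high cutoffs described in public analyses of COMPAS decile scores (the underlying scoring model is secret):

```python
# COMPAS reports decile scores 1-10; public analyses describe the
# cutoffs below. The proprietary model that produces the decile is hidden.
def risk_label(decile: int) -> str:
    if decile <= 4:
        return "low"
    if decile <= 7:
        return "medium"
    return "high"

for d in (2, 6, 9):
    print(f"decile {d} -> {risk_label(d)} risk")
```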
HCE tools:
classification - COMPAS sorts and orders humans into discrete categories of “low,” “medium,” or “high” risk
power - COMPAS concentrates power in the algorithm and the company that created it
representation - COMPAS score is a technological representation made to “stand in” for an individual defendant’s future criminality
sociotechnical system - technical part (algorithm) is intertwined with the social actions (courtroom decisions)
Reading: “A Mulching Proposal: Analyzing and Improving an Algorithmic System for Turning the Elderly into High-Nutrient Slurry” by Os Keyes
Satire Setup: Imagines an algorithm that decides which elderly people to turn into “slurry,” highlighting the absurdity of applying ethics frameworks to a fundamentally immoral idea.
Core Idea: Ethical frameworks like FAT (Fairness, Accountability, Transparency) can “improve” a system without questioning whether it should exist at all.
Fairness Example: Bias was “fixed” by making mulching equally likely across all demographic groups — fairness as equal harm.
Accountability Example: Added a meaningless safeguard — a 10-second appeal to a human “death doula.”
Big Point: Data ethics often focuses on implementation (bias, accountability) while ignoring that some ideas are inherently unethical, no matter how fairly they’re executed.
HCE tools:
narratives - the satire is a direct attack on the dominant narrative of data ethics: that algorithms can always be fixed through frameworks like FAT
power - fictional mulching algorithm represents the ultimate exercise of power (the capacity to decide life and death) highlighting how technical fixes can obscure abuse of power
performativity - the “fairness fix” was performative: bias was “fixed” by making mulching equally likely across all demographic groups, so fairness was enacted as an equal chance of harm
classification - sorting the elderly into two categories of those who live and those who turn into “slurry”
Case study: UK “Ofqual” controversy
The UK Ofqual controversy was a major failure of automated decision-making (ADM)
Used biased data: Relied on each school’s past grades (a biased proxy), penalizing high-performing students in improving or disadvantaged schools (the standardization logic is sketched after this list).
Caused real harm: Resulted in Representational Harm by unfairly downgrading nearly 40% of students' final marks, limiting their agency (power to get into university).
Led to blame shifting: Public protested the "algorithm," but the failure was due to the human decision to use biased data, leaving officials as the moral crumple zone.
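A simplified sketch of the standardization idea (not Ofqual’s published model): impose the school’s historical grade distribution on this year’s students in teacher-assessed rank order:

```python
# Simplified Ofqual-style standardization on invented numbers: the
# school's historical grade shares are imposed on this year's cohort.
historical_share = {"A": 0.10, "B": 0.30, "C": 0.60}  # hypothetical school

ranked = [f"student_{i}" for i in range(1, 11)]  # best first (teacher ranking)

grades, i = {}, 0
for grade, share in historical_share.items():
    n = round(share * len(ranked))
    for s in ranked[i:i + n]:
        grades[s] = grade
    i += n

print(grades)
# A strong cohort at a historically weak school still gets mostly Cs:
# the school's past, not the student's work, caps the outcome.
```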
HCE tools:
representation - algorithm was meant to be a fair representation of a student’s final exam grade but it unfairly downgraded 40% of the students’ final marks leading to representational harm
agency - downgrading of marks directly limited the students’ agency specifically their power to get into university
identity/positionality - system penalized high-performing students in improving or disadvantaged schools reinforcing pre-existing negative positionality
sociotechnical system - technical part (algorithm) intertwined with the social part (human decision to use biased data) making system’s output inseparable from poor human design choices
Reading: “The Allegheny Algorithm” by Virginia Eubanks
Core Idea: Poverty Profiling. The AFST (Allegheny Family Screening Tool) uses an algorithm to predict child neglect, but it mostly acts as a poverty profiler by treating the use of public services as a sign of risk.
The model only uses data from families who access public services. It doesn’t predict actual harm, but only “proxies” (like being reported to the hotline), which are easily manipulated (a toy version is sketched after this list).
Mediation: The algorithm subtly mediates human judgment; managers believe the human should learn from the model when their assessments conflict.
The system targets the poor unfairly. The increased scrutiny and stigma can make families afraid to reach out for help, potentially creating the very neglect the algorithm seeks to prevent.
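A toy sketch of the proxy/visibility problem (features, weights, and the function name are invented): every input the model can see is a record of public-service use, so contact with services itself reads as risk:

```python
# Toy 'poverty profiling' score. Families served by private providers
# generate no records, so they are invisible to the model; the score
# tracks proxies of system contact, not actual harm.
def afst_style_score(family: dict) -> int:
    return (10 * family["public_benefits_use"]
            + 10 * family["prior_hotline_referral"])

visible_family   = {"public_benefits_use": 1, "prior_hotline_referral": 1}
invisible_family = {"public_benefits_use": 0, "prior_hotline_referral": 0}
print(afst_style_score(visible_family), afst_style_score(invisible_family))  # 20 0
```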
HCE tools:
classification - sorts and orders families based on a predicted score
power - tool of asymmetric power wielded by the state by using the tool to alter official decisions about families
identity/positionality - system targets the poor unfairly meaning the family’s socioeconomic positionality is used against them
sociotechnical system - technical part (algorithm’s score) is intertwined with the social actions (child welfare screening)
Case study: Unisex insurance debates
The Unisex Insurance Debates were a conflict over actuarial fairness
Insurers argued that charging different premiums based on gender was fair because it matched each group’s real risk (the premium arithmetic is sketched after this list)
Activists fought this, saying it was discrimination because people shouldn't be penalized for things they can't control
Both sides ultimately agreed that the goal should be to use better data to make prices more accurately reflect individual risk
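The underlying arithmetic of actuarial fairness, with made-up numbers: a premium is “fair” in this sense when it equals the expected claim cost of the rate class:

```python
# Group-rated vs pooled (unisex) premiums for the same total expected
# cost. All rates and group sizes are invented.
claim_cost = 10_000.0                      # cost of one claim
risk = {"group_a": 0.02, "group_b": 0.01}  # hypothetical annual claim rates
size = {"group_a": 500, "group_b": 500}    # members per group

for g, p in risk.items():  # each group pays its own expected cost
    print(f"{g}: group-rated premium = ${p * claim_cost:,.0f}")

total = sum(risk[g] * size[g] * claim_cost for g in risk)
pooled = total / sum(size.values())        # everyone pays the same
print(f"pooled (unisex) premium = ${pooled:,.0f}")
```

The debate was over which of these two prices counts as “fair,” not over the arithmetic itself.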
HCE tools:
classification - insurers argued that classifying people based on gender to match statistical risk was a “fair” classification
identity/positionality - insurers used gender identity to assign a specific positionality within the risk pool
Reading: “Panopticism” by Michel Foucault
Architectural concept for a prison designed by Jeremy Bentham
Uses a central tower to create the feeling of constant, unseen surveillance by the guards
Goal is to force the prisoners to discipline their own behavior and always act as if they are being monitored
HCE tools:
power - central tower designed for asymmetric power of guards to alter the behavior of prisoners through the mere feeling of constant surveillance
agency - eliminates prisoners’ agency by creating conditions in which they must always act as if they are being monitored, so they cannot act spontaneously
sociotechnical system - technology (architectural design) is intertwined with the social goal (prison discipline and behavior modification)