Notes on "Bias in Computer Systems": An Overview
Introduction
Bias in computer systems is a significant issue that has been observed in various real-world applications.
Airline Reservation Systems (Sabre and Apollo):
- In the 1980s, allegations of anticompetitive practices were brought against American and United Airlines due to biases in their reservation systems (Sabre and Apollo).
- These systems favored "on-line" flights (flights with all segments on a single carrier).
- This systematically disadvantaged international carriers and smaller domestic carriers, which operated fewer single-carrier segments, as highlighted by Fotos (1988) and Ott (1988).
- Interface design compounded the bias by displaying only a few options per screen, with 90% of tickets booked from the first screen display (Taib, 1990).
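The interaction between a ranking that favors on-line flights and a small first screen can be sketched as follows. The penalty weight, screen size, and flight data are hypothetical illustrations, not the actual Sabre/Apollo logic:

```python
# Hypothetical sketch: a ranking that penalizes interline (multi-carrier)
# itineraries, combined with a small first screen, buries those options
# even when they are the fastest. All numbers here are invented.
SCREEN_SIZE = 3  # options shown on the first screen (illustrative)

def rank_key(flight):
    # Lower score sorts first; the interline penalty is an assumed weight.
    interline_penalty = 0 if flight["single_carrier"] else 100
    return flight["elapsed_minutes"] + interline_penalty

flights = [
    {"id": "AA 12",        "single_carrier": True,  "elapsed_minutes": 410},
    {"id": "UA 7",         "single_carrier": True,  "elapsed_minutes": 425},
    {"id": "BA 2 + AA 90", "single_carrier": False, "elapsed_minutes": 390},
    {"id": "AA 33",        "single_carrier": True,  "elapsed_minutes": 440},
    {"id": "LH 5 + UA 11", "single_carrier": False, "elapsed_minutes": 400},
]

first_screen = sorted(flights, key=rank_key)[:SCREEN_SIZE]
print([f["id"] for f in first_screen])
# The two fastest itineraries (both interline) never reach the first
# screen, where roughly 90% of bookings were made.
```

The point of the sketch is that neither piece alone looks decisive: the penalty merely reorders, and the screen merely truncates; bias arises from their combination.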
Impact of Biased Systems:
- Biased systems can have widespread impact because they are inexpensive to disseminate.
- Complexity of systems can hide biases within the code.
- Unlike biased individuals, biased systems offer no means for appeal.
Categories of Bias
- Three categories of bias in computer systems:
- Preexisting bias
- Technical bias
- Emergent bias
Defining Bias
- A computer system is biased if it systematically and unfairly discriminates against certain individuals or groups in favor of others.
- Unfair discrimination involves denying opportunities or assigning undesirable outcomes on unreasonable or inappropriate grounds.
- Example: Credit Advisor:
- Denying credit based on poor payment records is not bias.
- Denying credit based on ethnic surnames is unfair discrimination and, therefore, a sign of bias.
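The distinction between a defensible criterion and a biased one can be made concrete with a toy credit-advisor sketch. The function names, threshold, and data are illustrative, not from any real system:

```python
# Hypothetical sketch contrasting a defensible decision criterion
# (payment history) with a biased one (ethnic surname).
def decide_on_record(applicant):
    # Denial grounded in payment history: systematic but not unfair.
    return "deny" if applicant["missed_payments"] > 3 else "approve"

def decide_on_surname(applicant, flagged_surnames):
    # Denial grounded in ethnicity: systematic AND unfair, hence biased.
    if applicant["surname"] in flagged_surnames:
        return "deny"
    return decide_on_record(applicant)

applicant = {"surname": "Garcia", "missed_payments": 0}
print(decide_on_record(applicant))               # approve
print(decide_on_surname(applicant, {"Garcia"}))  # deny
```

Both functions discriminate systematically; only the second does so on unreasonable grounds, which is what makes it bias under this definition.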
Key Considerations:
- Unfair discrimination must occur systematically to be considered bias.
- Systematic discrimination must be joined with an unfair outcome to establish bias.
- Example: Patriot Missile System:
- A software error in the Patriot missile system (an accumulating clock-drift inaccuracy) led to a failed interception and deaths during the Persian Gulf War (GAO, 1992).
- While a systematic error, it doesn't involve unfairness, thus not considered bias in this framework.
Framework for Analyzing Bias
Methodology
- The framework was developed by examining actual computer systems, identifying instances of bias, and categorizing them by source.
- 17 computer systems from diverse fields were examined.
Preexisting Bias
- Rooted in social institutions, practices, and attitudes.
- Can enter a system explicitly/consciously or implicitly/unconsciously.
- Example: Automated Loan Advisor:
- Negatively weighting applicants from "undesirable" locations (red-lining) embeds biases based on group stereotypes.
Technical Bias
- Arises from technical constraints or considerations in the design process.
- Sources include:
- Limitations of computer tools.
- Decontextualized algorithms.
- Imperfections in pseudorandom number generation.
- Formalization of human constructs.
- Example: Airline Reservation Systems (Sabre and Apollo):
- The monitor screen size constrained the number of flight options displayed, making the ranking algorithm critically important.
Emergent Bias
- Arises in the context of use, typically after the design is completed.
- Results from changing societal knowledge, population, or cultural values.
- User interfaces are prone to emergent bias.
- Example: Automated Airline Reservation System:
- A system designed for national airlines may disadvantage international airlines due to flight-ranking algorithms.
Applications of the Framework
The National Resident Match Program (NRMP)
- Centralized method for assigning medical school graduates to hospital programs.
- Criticism 1: Favoring Hospital Programs:
- The Admissions Algorithm systematically favors hospital programs over medical students in cases of conflict (Graettinger and Peranson, 1981b; Roth, 1984; Sudarshan and Zisook, 1981; Williams et al., 1981).
- This preference duplicates what happens in an actual admissions process without computerized matching, embodying a preexisting bias.
- Criticism 2: Bias Against Married Couples:
- Earlier versions of the NRMP were biased against married couples, as the Admissions Algorithm placed them at a disadvantage compared to single peers (Roth, 1984; 1990).
- This bias emerged when more married couples participated in the match process, highlighting emergent bias.
- Criticism 3: Bias Against Rural Hospitals:
- Urban hospitals are more successful in filling positions than rural ones (Roth, 1984; Sudarshan and Zisook, 1981).
- This is not necessarily a bias, because it reflects the preferences of match participants.
- The NRMP exemplifies how centralized computing systems can hold users hostage to biases embedded within the system.
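The first NRMP criticism reflects a well-known property of deferred-acceptance (Gale–Shapley style) matching: the proposing side obtains its best stable outcome. The sketch below is a generic one-seat-per-program version with invented preference lists, not the actual NRMP Admissions Algorithm; swapping which side proposes changes who the result favors:

```python
# Sketch of deferred acceptance (Gale-Shapley). The preference lists
# are invented for illustration; the NRMP algorithm itself is more
# elaborate (multiple seats, couples, etc.).
def deferred_acceptance(proposer_prefs, receiver_prefs):
    """Return a stable matching {proposer: receiver}, one seat each."""
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in receiver_prefs.items()}
    free = list(proposer_prefs)
    next_choice = {p: 0 for p in proposer_prefs}
    engaged = {}  # receiver -> current proposer
    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]  # p's next-best receiver
        next_choice[p] += 1
        cur = engaged.get(r)
        if cur is None:
            engaged[r] = p
        elif rank[r][p] < rank[r][cur]:
            engaged[r] = p       # r trades up; cur becomes free again
            free.append(cur)
        else:
            free.append(p)       # r rejects p
    return {p: r for r, p in engaged.items()}

hospital_prefs = {"H1": ["S1", "S2"], "H2": ["S2", "S1"]}
student_prefs  = {"S1": ["H2", "H1"], "S2": ["H1", "H2"]}

# Hospital-proposing: every hospital gets its first choice.
print(deferred_acceptance(hospital_prefs, student_prefs))
# Student-proposing: every student gets its first choice.
print(deferred_acceptance(student_prefs, hospital_prefs))
```

Both outcomes are stable, yet they differ; which one a centralized match imposes is a design decision, which is why the hospital-favoring choice counts as preexisting bias carried into the system.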
A Multilevel Scheduling Algorithm (MLSA)
- MLSA balances response time and computation speed in timeshare computer systems.
- It gives processing attention to new commands as quickly as possible.
- Long-running processes could wait all day to finish.
- Individuals with long-running programs are systematically disadvantaged.
- Users developed counterstrategies to run long-running tasks in small chunks.
- The MLSA violates the fairness preserved in the "first-come first-served" strategy.
- The MLSA bias is considered technical, as the algorithm arose in the attempt to satisfy a difficult technical requirement: allocating a scarce resource.
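A toy simulation makes the disadvantage visible. It assumes a simplified two-level policy in which fresh short commands always preempt a long-running job; the job sizes and arrival times are invented for illustration:

```python
# Toy simulation of a simplified "respond to new commands first" policy
# versus first-come first-served. All numbers are illustrative.
def finish_time_short_first(long_burst, shorts):
    """shorts: list of (arrival, burst); the long job arrives at t=0."""
    t, remaining = 0, long_burst
    pending = sorted(shorts)
    while remaining > 0:
        if pending and pending[0][0] <= t:
            _, burst = pending.pop(0)
            t += burst            # a short command preempts and runs first
        else:
            # run the long job only until the next short command arrives
            horizon = pending[0][0] if pending else t + remaining
            step = min(remaining, max(horizon - t, 1))
            t += step
            remaining -= step
    return t

def finish_time_fcfs(long_burst, shorts):
    # First-come first-served: the long job, having arrived first,
    # runs to completion before any later command.
    return long_burst

long_burst = 50
shorts = [(i, 2) for i in range(0, 200, 5)]  # a short command every 5 units
print(finish_time_fcfs(long_burst, shorts))         # done at t = 50
print(finish_time_short_first(long_burst, shorts))  # done much later
```

The long job finishes far later under the responsiveness-first policy than under FCFS, which is the systematic disadvantage users countered by splitting long tasks into small chunks.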
The British Nationality Act Program (BNAP)
- Translates the British Nationality Act into a computer program.
- Criticism 1: Gender Bias:
- The Act is biased against the illegitimate descendants of British men, as "a man is the ‘father’ of only his legitimate children, whereas a woman is the ‘mother’ of all her children, legitimate or not" (Sergot et al., 1986, p. 375).
- The BNAP embodies preexisting bias by accurately representing the British Nationality Act.
- Criticism 2 & 3: Emergent Bias
- The system was designed in a research environment by people with sophisticated knowledge of immigration law (experts). Its likely users, however, are at best paralegals or immigration counselors in Britain, if not lay persons in foreign countries with only limited access to British legal expertise (nonexperts).
- No mechanism existed to incorporate relevant case law as it came into being (Sergot et al., 1986).
- If the accumulation of case law led to changes in how the Act was applied, BNAP would systematically misinform its users.
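The quoted parentage rule can be encoded directly, which shows how a faithful translation reproduces the Act's gender bias. BNAP was actually written as a logic program; the Python predicates and record layout below are an illustrative stand-in, not its real code:

```python
# Sketch of the parentage rule quoted above. The data structure and
# function names are illustrative, not BNAP's actual (logic-program) code.
def is_mother(woman, child):
    # "a woman is the 'mother' of all her children, legitimate or not"
    return child["mother"] == woman

def is_father(man, child):
    # "a man is the 'father' of only his legitimate children"
    return child["father"] == man and child["legitimate"]

child = {"mother": "M", "father": "F", "legitimate": False}
print(is_mother("M", child))  # True
print(is_father("F", child))  # False: descent through the father is blocked
```

Nothing in the encoding is a programming error; the bias is preexisting in the Act, and an accurate program carries it forward unchanged.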
Considerations for Minimizing Bias
- Remedying bias involves identifying it and developing methods of avoiding/correcting it.
- If unfairness can be established in the system’s systematic discrimination, then the charge of bias follows.
- The presence of bias is an aspect of a system in use, not inherent in the system itself.
Strategies for Minimizing Bias
- Preexisting Bias:
- Designers must scrutinize design specifications and understand relevant biases.
- Consider common biases related to cultural identity, class, gender, literacy, handedness, and physical disabilities.
- Include representative individuals in field test groups.
- Technical Bias:
- Designers must envision the design, algorithms, and interfaces in use.
- Weigh considerations of ease of access against equity of access.
- Emergent Bias:
- Designers should anticipate probable contexts of use.
- Articulate constraints on the appropriate contexts of a system’s use.
- Communicate the perspectives and audience assumed in the design.
- System designers and administrators can take responsible action if bias emerges with changes in context.
Designer's Responsibility
- Designers should find support from their professional community to take an effective stand against biased systems.
- The computing community must recognize bias as a feature of computer systems that is worth addressing and minimizing.
- Bias-free system design forms one part of a movement to a more equitable society.
Conclusion
- Freedom from bias should be counted among the select set of criteria according to which the quality of systems in use in society should be judged.
- Concern with bias in system design and experience with these methods can be integrated with other software engineering methods as part of the standard for a computer science curriculum.
- As a community, we must hold our designs accountable to a reasonable standard of freedom from bias, against which negligence can be judged.