Digital Inequalities Flashcards

Description and Tags

Flashcards about Digital Inequalities


16 Terms

1

What are the learning outcomes of the lecture on digital inequalities?

To understand how digital technologies may reinforce existing inequalities, and to consider responses such as algorithmic auditing and fairness metrics.

2

What is the main focus of the lecture regarding digital inequality?

Focuses on unequal outcomes, such as those experienced by people subject to algorithmic decision-making.

3

What type of decisions do computer scientists make when dealing with algorithmic systems?

They are engaged in making moral and ethical decisions that impact people’s lives.

4

What is Data Justice?

Data justice concerns the ways in which big data systems can discriminate, discipline, and control.

5

What does Algorithmic Social Justice address?

Addresses how AI-driven systems reinforce, mitigate, or reshape social inequalities.

6

Describe a racial bias example in AI.

AI models for skin cancer identification being less accurate on darker skin tones.

7

Describe a gender bias example in AI.

AI models missing a higher percentage of liver disease cases in women compared to men.

8

What does the gender gap in accuracy for liver disease AI models reflect?

Gender inequalities in clinical practice.

9

What is predictive policing?

The use of historic crime data to determine how to allocate police geographically.

10

What is the problem with predictive policing?

The models end up predicting future policing patterns (where police have previously been deployed) rather than actual crime.

11

What is the COMPAS tool?

A tool that gives people who have been arrested a ‘risk score’ predicting their likelihood of reoffending within two years.

12

What did ProPublica find in their investigation of the COMPAS algorithm?

The algorithm was more likely to wrongly label black defendants as high risk and more likely to wrongly label white defendants as low risk.

13

What is Algorithmic Auditing?

A method to analyze AI models by repeatedly querying them to detect biases or unintended behaviors.
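The repeated-querying idea can be sketched as follows. This is a toy illustration, not from the lecture: `model_predict` is a hypothetical black-box system (given a deliberately biased rule so the audit has something to find), and the audit sends matched queries that differ only in a demographic attribute, then compares approval rates per group.

```python
import random

def model_predict(applicant):
    # Hypothetical black-box model under audit (assumption for illustration):
    # a toy rule that penalizes group "B" applicants.
    base = applicant["income"] / 100_000
    penalty = 0.2 if applicant["group"] == "B" else 0.0
    return base - penalty > 0.3

def audit_by_group(n_queries=10_000, seed=0):
    """Repeatedly query the model with matched inputs, varying only the
    demographic attribute, and record approval rates per group."""
    rng = random.Random(seed)
    counts = {"A": [0, 0], "B": [0, 0]}  # group -> [approvals, total]
    for _ in range(n_queries):
        income = rng.uniform(20_000, 120_000)
        for group in ("A", "B"):
            approved = model_predict({"income": income, "group": group})
            counts[group][0] += int(approved)
            counts[group][1] += 1
    return {g: approvals / total for g, (approvals, total) in counts.items()}

rates = audit_by_group()
gap = rates["A"] - rates["B"]  # a large positive gap flags disparate treatment
```

Because each query pair is identical except for the group attribute, any difference in approval rates is attributable to the model's treatment of that attribute.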

14

Give an example of expert-led audits methodology in Algorithmic Auditing.

Detecting bias in AI-generated images favoring certain demographics.

15

Name two examples of fairness metrics.

Statistical Parity Difference and Equal Opportunity Difference.

16

Does the choice of fairness metric matter?

Yes. Different fairness metrics can produce different outcomes for the same model, so the choice of metric affects which disparities are detected.
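As a concrete illustration (toy data, not from the lecture), the two metrics named above can disagree on the same predictions: statistical parity compares positive-prediction rates across groups, while equal opportunity compares true-positive rates.

```python
def statistical_parity_difference(y_pred, group):
    """P(pred=1 | group=0) - P(pred=1 | group=1):
    difference in positive-prediction rates between the two groups."""
    def rate(g):
        preds = [p for p, gr in zip(y_pred, group) if gr == g]
        return sum(preds) / len(preds)
    return rate(0) - rate(1)

def equal_opportunity_difference(y_true, y_pred, group):
    """P(pred=1 | true=1, group=0) - P(pred=1 | true=1, group=1):
    difference in true-positive rates between the two groups."""
    def tpr(g):
        preds = [p for t, p, gr in zip(y_true, y_pred, group) if gr == g and t == 1]
        return sum(preds) / len(preds)
    return tpr(0) - tpr(1)

# Toy data: both groups get the same positive-prediction rate (0.5),
# but group 1's actual positives are caught less often.
group  = [0, 0, 0, 0, 1, 1, 1, 1]
y_true = [1, 1, 0, 0, 1, 1, 1, 0]
y_pred = [1, 1, 0, 0, 1, 1, 0, 0]

spd = statistical_parity_difference(y_pred, group)        # → 0.0 ("fair")
eod = equal_opportunity_difference(y_true, y_pred, group)  # ≈ 0.33 ("unfair")
```

Here statistical parity reports no disparity while equal opportunity reports a sizeable one, which is exactly why the choice of metric matters.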