Data Ethics


Last updated 4:57 PM on 4/9/26

14 Terms

1

Psych in the era of big tech

  • Research → big data

  • Education → genAI

  • Clinical Practice → MH apps, teletherapy, digital phenotyping

Psychologists helped build these systems: they laid the foundations with work that sought to quantify personality, predict behavior, classify individuals, and identify stable traits. Combined with big tech firms, this leads to security problems.

2

Why is data so valuable?

  • Predicts behavior

  • Infers personality traits, health issues, family history, etc.

  • Models preferences and vulnerabilities

  • Predictive data can be sold to:

    • Advertisers, political campaigns, insurance companies, data brokers

3

Cambridge Analytica

  • A psychologist marketed an app as a personality quiz for research

  • Data was collected from 87M people

  • Sold to Cambridge Analytica, which used it to build psychological profiles

  • Politicians hired the firm to influence elections with targeted ads

4

Problems with MH apps: Cerebral

  • Telehealth startup

  • Shared private health information of more than 3.1M US users with advertisers (e.g., Facebook, Google, TikTok)

  • Name, phone #, email, DOB, IP address, mental health assessments

  • Treatment/insurance information

  • Shared in real time; users had no idea they were opting in to tracking

5

Problems with MH apps: BetterHelp

  • $7.8M was paid to users as part of a settlement

  • The company had assured customers their data wouldn’t be shared except for providing counseling

  • Shared emails, IP addresses, and health questionnaire responses with Facebook, Snapchat, and Pinterest

6

Digital Phenotyping

  • The moment-by-moment measurement of behavior using personal digital devices (e.g., mental health apps)

  • Device data can help predict a person’s mental state for diagnosis

→ Physical activity (e.g., walking), sleep patterns, communication with others, social media use, time spent viewing or reading content

  • 90% accuracy in predicting depression/anxiety and postpartum depression

7

Potential benefits to phenotyping

  • Earlier intervention, continuous monitoring, increased access, personalized treatment

8

Do MH apps work?

  • NO!

  • They show mild results or no meaningful change

  • Some make suicidal ideation, self-injury, and drinking behaviors worse

9

Harms of data sharing

  • Data breaches

  • Doxxing

  • Ransomware attacks

  • Denied insurance, adjusted loan rates

  • Predict risk

  • Deny jobs

  • No safeguards for how police weaponize this data against users

  • Illegal government surveillance, weaponized against marginalized groups

10

Why is MH data uniquely sensitive?

  • May include trauma, substance use, sexual orientation, affairs, medication histories

  • Leaks can create stigma, alter criminal proceedings, cause loss of privacy, break up families, destroy reputations, and change insurance coverage

11

Environmental harms

  1. Lithium and rare earth mining

  • Lithium mining driven by increased demand for battery-operated tech (Nevada, South America)

  • Contaminates groundwater for 300 years; acid baths the size of lakes

  2. Indigenous exploitation

  • Rush from tech companies to purchase lithium-rich lands, some of which are home to Indigenous peoples

  • Chile: a company created deals with the Atacama peoples to mine the deposits on their ancestral land; the peoples receive only $9–60K a year compared to the company’s $250M

  3. Legitimizes geopolitical violence

  • Rwanda, Bolivia, etc. supply tin, tantalum, tungsten, and gold, often sourced from quarries by groups using child labor

  • Destroys coral reefs and mangrove forests; 100+ deaths a year

  • Intel, Apple, Dell, and Philips purchase these materials despite knowledge of child labor, human rights abuses, and environmental issues

  4. Massive electricity/water use

  • Companies like Google and OpenAI

12

What should psychologists do to help data ethics?

  • Understand psychologists are not neutral

  • Pressure companies and universities

  • Advocate for regulation

  • Refuse exploitative partnerships

13

Why are research ethics board policy changes not enough?

Ethical change must also occur at:

  • Institutional level

  • Corporate level

  • Regulatory level

  • Academic culture level

14

How to increase digital privacy

  • Use privacy-focused browsers and search engines

  • Limit tracking (turn off auto-play, predictive text, location history)

  • Use a VPN or Tor, especially on public Wi-Fi

  • Read cookie policies and opt out of data collection

  • Remove unnecessary browser extensions and avoid linking apps to platforms

  • Use strong, unique passwords and multi-factor authentication

  • Accept software updates and use ad-blockers

  • Regularly check privacy settings on devices and smart technology

  • Delete personal data before disposing of devices
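As one concrete illustration of the "strong, unique passwords" tip above, here is a minimal Python sketch that generates a random password with the standard-library `secrets` module (the function name and default length are illustrative assumptions, not part of the course material):

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Return a cryptographically random password drawn from
    letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Generate a fresh, unique password for each account
print(generate_password())
```

Using `secrets` rather than `random` matters here: `random` is not designed for security-sensitive use, while `secrets` draws from the operating system's cryptographic randomness source.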