Lecture 6: Computing for Empathy



24 Terms

1

what is empathy

Empathy is the ability to understand and share the feelings of another. It includes: Cognitive empathy (understanding another’s perspective), Affective empathy (sharing another’s emotions), and Compassionate empathy (taking action). Empathy is central to relationships and ethics. Machines lack true emotion but can simulate empathy.

2

can machines feel?

Can we distinguish between an AI recognizing emotions vs. genuinely experiencing them? If an AI seems empathetic, does it matter if it truly "feels"? What are the implications for trust and relationships?

3

how do we simulate empathy?

Simulation theory: people understand others by simulating emotions. AI designers use this to create chatbots and human-like interactions. But simulated empathy might be mistaken for real empathy.

4

how AI simulates empathy

Natural language processing (sentiment analysis, emotion detection), computer vision (facial expression recognition using CNNs), voice analysis (tone, pitch, stress), reinforcement learning from human feedback (RLHF), and personalization algorithms based on user behavior.
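The sentiment-analysis technique above can be sketched with a toy lexicon-based scorer (the word list and weights here are invented for illustration; real systems learn these from data):

```python
# Toy lexicon-based sentiment scorer: sum per-word weights,
# then read the sign as positive/negative/neutral.
LEXICON = {
    "happy": 1.0, "great": 1.0, "love": 1.5, "calm": 0.5,
    "sad": -1.0, "angry": -1.5, "hate": -1.5, "stressed": -1.0,
}

def sentiment_score(text: str) -> float:
    """Sum lexicon weights for each word; unknown words score 0."""
    return sum(LEXICON.get(w, 0.0) for w in text.lower().split())

def label(text: str) -> str:
    """Map the score's sign to a coarse sentiment label."""
    s = sentiment_score(text)
    return "positive" if s > 0 else "negative" if s < 0 else "neutral"
```

Note that the scorer only detects surface word choice — it has no model of the speaker's actual state, which is the gap between recognizing and experiencing emotion discussed in card 2.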

5

computing’s early views on emotion

Turing's imitation game asked: Can machines convince us they think? Early emotion detection used NLP. Affective computing studies how machines process and simulate emotions. Key question: does recognizing emotion = experiencing it?

6

games as empathy machines

Narrative-driven games let players experience different perspectives. Emotional immersion and interactive storytelling shape player emotions. Raises question: can games exploit emotions for engagement?

7

do we need empathy in computing?

Who benefits or is harmed by empathetic AI? Should AI focus on ethical fairness over emotional engagement? Used in healthcare, therapy, customer service, and education—should this be allowed? What are long-term effects?

8

theory of mind (ToM)

The ability to attribute mental states to others—cognitive perspective taking. Machines can simulate ToM to predict behavior, but face challenges: lack of understanding, privacy concerns, and potential manipulation.

9

affective computing framework (Picard, 1997)

Machines that recognize, interpret, and simulate emotions using facial, voice, and physiological cues. Applied in chatbots, customer service, and mental health. Criticized for bias and ethical concerns like emotional surveillance.
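Picard's recognize → interpret → simulate loop can be sketched as a minimal pipeline (the cue-to-emotion table and canned replies are hypothetical stand-ins for trained models):

```python
# Minimal sketch of an affective-computing loop:
# recognize cues -> interpret an emotion -> simulate an empathetic reply.
CUE_TO_EMOTION = {
    "frown": "sadness", "smile": "joy",
    "raised_voice": "anger", "flat_tone": "boredom",
}
RESPONSES = {
    "sadness": "I'm sorry to hear that. Want to talk about it?",
    "joy": "That's wonderful!",
    "anger": "I understand this is frustrating.",
    "boredom": "Shall we try something different?",
}

def recognize(cues):
    """Map observed facial/voice cues to candidate emotions."""
    return [CUE_TO_EMOTION[c] for c in cues if c in CUE_TO_EMOTION]

def interpret(emotions):
    """Pick a dominant emotion (here, naively, the first detected)."""
    return emotions[0] if emotions else "neutral"

def simulate(emotion):
    """Produce an empathetic-sounding reply -- simulated, not felt."""
    return RESPONSES.get(emotion, "I see.")

reply = simulate(interpret(recognize(["frown", "flat_tone"])))
```

The pipeline makes the critique concrete: every step is pattern matching on cues, so the system can surveil and respond to emotion without experiencing any.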

10

empathy-altruism hypothesis (Batson, 1991)

Empathy motivates altruistic behavior. Applied in philanthropic AI and social robots. Criticized since AI lacks true empathy and may exploit emotional responses.

11

ethical frameworks for AI empathy

Utilitarianism vs. deontological ethics: Should AI actions be based on outcomes or moral duties? Key to designing ethical empathetic AI systems.

12

algorithmic bias in emotion recognition

AI may misinterpret non-Western emotions due to dataset bias. Gender and racial biases also exist. AI can misread sarcasm and mental health cues. Raises privacy concerns: should AI analyze emotions without consent?
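One standard way to audit for the dataset bias described above is to compare a classifier's accuracy across demographic groups. A minimal sketch (the example data is invented for illustration):

```python
# Audit sketch: per-group accuracy of an emotion classifier on
# labeled (group, true_label, predicted_label) triples.
from collections import defaultdict

def per_group_accuracy(examples):
    """Return {group: fraction of correct predictions}."""
    hits, totals = defaultdict(int), defaultdict(int)
    for group, true, pred in examples:
        totals[group] += 1
        hits[group] += (true == pred)
    return {g: hits[g] / totals[g] for g in totals}

# Invented audit set: the classifier does well on group_a
# but misreads group_b's expressions as "neutral".
examples = [
    ("group_a", "joy", "joy"), ("group_a", "anger", "anger"),
    ("group_a", "sadness", "sadness"), ("group_a", "joy", "joy"),
    ("group_b", "joy", "neutral"), ("group_b", "anger", "anger"),
    ("group_b", "sadness", "neutral"), ("group_b", "joy", "joy"),
]
acc = per_group_accuracy(examples)
gap = max(acc.values()) - min(acc.values())  # accuracy disparity
```

A large gap signals that the training data under-represents some groups' emotional expression, which is exactly the misinterpretation risk the card describes.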

13

the case against empathy (Paul Bloom)

Empathy is biased—we feel more for people like us. Emotional empathy can impair moral decisions. Advocates for rational compassion and emotionally neutral AI.

14

case study: Microsoft Tay Chatbot (2016)

Tay learned from Twitter and became offensive due to unsupervised learning. Shows how AI can amplify harmful human biases without safeguards.

15

case study: Amazon Hiring AI (2018)

AI discriminated against women due to biased historical hiring data. Shows that empathetic AI can reinforce systemic discrimination if trained on biased inputs.

16

can AI learn to care?

Ethics of care (Carol Gilligan): morality through relationships and responsibility. Applied to carebots and emotional support AIs. But can real care exist without true emotion?

17

the limits of AI empathy

AI doesn't feel; it predicts. Ethics of care asks if relationships require real emotion. Should AI offer therapy or support? Raises questions about replacing emotional labor.

18

art as a medium for empathy

Includes VR experiences, data visualizations, and installations showing emotional contrast (e.g., sleek consumer tech vs. rough e-waste labor). Art fosters empathy by reflecting disparities in tech experiences.

19

do we need empathy in computing (revisited)

Rosalind Picard: Empathy enhances AI interactions in healthcare, education, etc. Paul Bloom: Empathy is irrational—AI should be guided by logic and fairness.

20

empathy vs rationality

Should AI mimic empathy? Risks include over-reliance and emotional manipulation. Can rationality and empathy coexist in design?

21

ai and empathy: the debate

Pros: Better mental health, education, and customer service. Cons: Risk of manipulation, loss of human connection, and ethical concerns over emotion surveillance.

22

replika as a friend

Replika is a chatbot acting as a companion. Users report emotional bonds. Raises ethical questions about emotional dependence and manipulation.

23

human replacement debate

If AI can perform emotional labor, should it replace humans in those roles? Should AI make emotional decisions? What safeguards are needed to prevent manipulation?

24

future directions in ai and empathy

Includes better bias detection, multimodal emotion AI, personalized AI therapy, emotional regulation laws (e.g., EU AI Act). Raises questions: Can AI truly feel? Should AI use emotions to persuade?