What is a social robot?
A robot designed for social interaction, combining autonomy + communication abilities.
Cultural origins of social robots
Inspired by myths, automata, and sci-fi, shaping expectations of companionship.
Technological origins of social robots
Developed through advances in AI, robotics, sensors, and machine learning.
Anthropomorphism
Humans naturally attribute human traits to non-human things (incl. robots). For example: talking to your robot like a person, naming your Roomba, thinking your car is “angry,” or feeling bad for a robot that falls over.
Social Robot Paradox
Humans respond socially to robots, but robots lack real social competence.
Uncanny Valley
Robots that look almost, but not fully, human → cause discomfort.

Core capability: Multimodal Communication
Robots communicate via voice, gesture, gaze, lights, sound.
Core capability: Affective Expression
Robots show emotions (smile, tone, lights) to improve interaction.
Core capability: Personality Traits
Robots maintain a consistent style (friendly, calm, energetic).
Core capability: Learning & Social Modeling
Robots adapt based on feedback and observe human behavior.
Core capability: Relationship Management
Robots handle long-term interactions: memory, continuity, trust.
Social Capability Spectrum (from simple → advanced)
Socially evocative → communication robots → affective robots → learning robots → socially competent robots.
Shared attention (meaning)
Human and robot focus on the same thing → increases social connection.
Cognitive modeling
Robot predicts human actions/intents using basic Theory of Mind.
Appearance types: Functional
Looks like a tool; design focused on utility.
Appearance types: Artifact-shaped
Looks like objects (lamps, toys) that talk or express.
Appearance types: Bio-inspired
Humanoid or animal-like (zoomorphic); follows design cues from living beings.
Proxemics
Social rules about personal space → robots must respect distances.
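A rough illustration (not from the cards): proxemic zones can be encoded as distance bands. The thresholds below follow Hall's commonly cited values and are assumptions that vary by culture and context.

```python
# Hypothetical sketch: classify a detected person's distance into proxemic zones.
# Thresholds are approximate (Hall's bands) and culture-dependent.

def proxemic_zone(distance_m: float) -> str:
    """Map a human-robot distance in meters to a proxemic zone."""
    if distance_m < 0.45:
        return "intimate"   # close contact; robots should normally avoid entering
    elif distance_m < 1.2:
        return "personal"   # friendly, one-to-one interaction distance
    elif distance_m < 3.6:
        return "social"     # typical service/assistant interaction distance
    else:
        return "public"

def should_stop_approach(distance_m: float) -> bool:
    """A socially aware robot might halt before entering the personal zone uninvited."""
    return proxemic_zone(distance_m) in ("intimate", "personal")

print(proxemic_zone(0.8))          # -> "personal"
print(should_stop_approach(2.0))   # -> False
```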
Context awareness
Robot adapts behavior to location, culture, situation.
Relational role: Robot for you
Robot acts as a tool/helper.
Relational role: Robot as you
Robot acts as the human’s proxy (i.e., does a task on their behalf).
Relational role: Robot with you
Robot is a teammate/collaborator.
Relational role: Robot around you
Robot shares space but doesn’t collaborate.
Relational role: Robot as if you
Robot mimics human behavior.
Relational role: Robot part of you
Robot becomes a body extension (prosthetics, exoskeleton).
Human-centered design
Design focuses on user needs, comfort, trust.
Robot-centered design
Design focuses on robot performance, humans adapt to robot.
Social perception
Robot recognizes human faces, gestures, gaze, emotion.
Intelligence (definition)
Ability to achieve goals under uncertainty.
Autonomy (definition)
Robot can act independently without external control.
Measuring autonomy
Through perception, planning, learning, action execution, and modeling.
Interaction proximity types (3)
Physical, co-located, remote.
Temporal interaction dimensions
Timespan, duration, frequency.
Verbal communication: Generation
Robot produces content + delivery (tone, timing, gesture sync).
Verbal communication: Recognition
Robot uses semantic parsing, grounding, multimodal cues.
Kinesics
Body movements: pointing, nodding, regulating turn-taking.
Gaze functions
Shows attention, regulates turns, conveys attitudes.
Affective computing
Detecting + responding to human emotion.
Arousal-Valence Model
Emotions described along two dimensions: arousal (energy level) and valence (positive/negative).
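A minimal sketch of the model, using illustrative (uncalibrated) coordinates for a few named emotions; the placements and the nearest-neighbour labeling are my own assumptions, not part of the course material.

```python
# Hypothetical sketch: represent emotions as (valence, arousal) points and
# label an observed affective state by its nearest named emotion.
import math

EMOTIONS = {
    "happy":   ( 0.8,  0.5),   # positive valence, moderately high arousal
    "excited": ( 0.6,  0.9),
    "calm":    ( 0.4, -0.6),   # positive valence, low arousal
    "sad":     (-0.7, -0.4),
    "angry":   (-0.6,  0.8),
    "bored":   (-0.3, -0.8),
}

def nearest_emotion(valence: float, arousal: float) -> str:
    """Return the named emotion closest to the given (valence, arousal) point."""
    return min(
        EMOTIONS,
        key=lambda name: math.dist(EMOTIONS[name], (valence, arousal)),
    )

print(nearest_emotion(0.7, 0.6))    # -> "happy"
print(nearest_emotion(-0.5, 0.7))   # -> "angry"
```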
Theory of Mind (ToM)
Understanding what others think, feel, or intend. Helps robots predict what humans will do next.
Legible motion
Robot moves in ways humans easily understand (clear intent).
Intention recognition
Robot predicts what humans will do next.
Handover collaboration
Robot safely and clearly exchanges objects with humans.
Socially Aware Navigation
Robots move while respecting human intentions, social norms, and comfort zones.
Socially Guided ML: Demonstrations
Robot learns by watching humans.
Socially Guided ML: Social cues
Robot uses gaze, feedback, gestures to improve learning.
Socially Guided ML: Scaffolding
Human teaches robot step-by-step.
Socially Guided ML: Transparency
Robot shows its learning state.
Multimodal integration
Robot fuses speech, gesture, gaze, and emotion to act socially.
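A toy late-fusion sketch, assuming made-up modality weights and recognizer outputs, just to show how separate channels could vote on a single interpreted intent; the labels and weights are hypothetical.

```python
# Hypothetical sketch: simple late fusion of per-modality interpretations.
# Each recognizer outputs (intent_label, confidence); the robot acts on the
# highest weighted combined score. Weights are illustrative assumptions.
from collections import defaultdict

MODALITY_WEIGHTS = {"speech": 0.5, "gesture": 0.3, "gaze": 0.2}

def fuse(observations: dict[str, tuple[str, float]]) -> str:
    """observations: modality -> (intent_label, confidence in [0, 1])."""
    scores = defaultdict(float)
    for modality, (label, confidence) in observations.items():
        scores[label] += MODALITY_WEIGHTS.get(modality, 0.1) * confidence
    return max(scores, key=scores.get)

# Example: speech and gaze agree the user wants the cup; gesture is ambiguous.
obs = {
    "speech":  ("hand_me_cup", 0.9),
    "gesture": ("point_at_door", 0.4),
    "gaze":    ("hand_me_cup", 0.7),
}
print(fuse(obs))  # -> "hand_me_cup"
```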