SiR - Lecture 4

36 Terms

1
New cards

What is Dialogue Management (DM)?

System that tracks the conversation and decides the robot’s next action.

2
New cards

DM Pipeline (order)

ASR → NLU → DM → NLG → TTS
(Automatic Speech Recognition → Natural Language Understanding → Dialogue Management → Natural Language Generation → Text-to-Speech)
Recognize speech → understand the text → decide → generate a response → speak.
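
A minimal sketch of the pipeline as plain function composition (all module bodies are placeholder stubs, not the lecture's implementation):

  # Hypothetical pipeline skeleton: every stage is a stub standing in for a real module.
  def asr(audio: bytes) -> str:                # ASR: speech → text
      return "grab the blue cup"

  def nlu(text: str) -> dict:                  # NLU: text → meaning (intent + entities)
      return {"intent": "pick_up", "entities": {"object": "blue cup"}}

  def dm(meaning: dict, state: dict) -> str:   # DM: meaning + dialogue state → next action
      state["last_intent"] = meaning["intent"]
      return "confirm_object"

  def nlg(action: str) -> str:                 # NLG: action → response text
      return "Do you mean the blue cup next to the red block?"

  def tts(text: str) -> bytes:                 # TTS: text → speech audio
      return text.encode()

  def respond(audio: bytes, state: dict) -> bytes:
      return tts(nlg(dm(nlu(asr(audio)), state)))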

3
New cards

Why is DM difficult? (list)

  • Situated grounding

  • Confirmations

  • Clarifications

  • Repairs

  • Repetitions

  • Common sense

4
New cards

Situated grounding (definition)

Interpreting language in a real physical context. Example: “Grab the blue cup next to the red block.”

5
New cards

Repair (definition)

Fixing misunderstandings.
Example: “We meet at 6—no, 7:30.”

6
New cards

Four functions of DM (Traum & Larsson, 2003)

  • Update context

  • Provide expectations for interpretation

  • Coordinate with other modules

  • Decide what to say/do next

7
New cards

Handcrafted DM (definition)

Human-designed rules or structures (no learning).
Pros: control.
Cons: rigid.

8
New cards

Rule-Based Approach

Uses if–then rules, no context memory.
Simple but inflexible.
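
A toy illustration of stateless if–then matching (the keyword rules are invented for illustration):

  # Stateless if–then rules: the reply depends only on the current utterance,
  # so the system cannot remember earlier turns.
  RULES = [
      ("hello", "Hi! How can I help?"),
      ("weather", "I cannot check the weather."),
      ("bye", "Goodbye!"),
  ]

  def rule_based_reply(utterance: str) -> str:
      for keyword, reply in RULES:
          if keyword in utterance.lower():
              return reply
      return "Sorry, I did not understand that."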

9
New cards

Finite-State Machines (FSMs)

Fixed states + transitions.
Good for simple tasks.
Bad at handling topic shifts.
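
One way to sketch an FSM dialogue (states, inputs, and transitions invented for a coffee-ordering example):

  # Fixed states and transitions; anything outside the table (e.g. a topic shift)
  # has no matching transition, which is exactly the FSM's weakness.
  TRANSITIONS = {
      ("greeting", "hello"): "ask_order",
      ("ask_order", "coffee"): "confirm",
      ("confirm", "yes"): "done",
      ("confirm", "no"): "ask_order",
  }

  def step(state: str, user_input: str) -> str:
      return TRANSITIONS.get((state, user_input), state)  # stay put if nothing matches

  state = "greeting"
  for utterance in ["hello", "coffee", "yes"]:
      state = step(state, utterance)
  print(state)  # → "done"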

10
New cards

Frame-Based Approach

Uses slots (variables filled with values from the user) instead of a fixed state structure.
Can fill multiple slots at once.
More flexible than FSM.
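
A minimal slot-filling sketch (slot names and the keyword matcher are assumptions for illustration):

  # A frame is just a set of slots; a single utterance may fill several of them at once.
  frame = {"drink": None, "size": None, "milk": None}

  KEYWORDS = {
      "drink": ["coffee", "tea"],
      "size": ["small", "large"],
      "milk": ["oat", "soy"],
  }

  def fill_slots(utterance: str, frame: dict) -> dict:
      for slot, words in KEYWORDS.items():
          for word in words:
              if word in utterance.lower():
                  frame[slot] = word
      return frame

  fill_slots("A large oat coffee, please", frame)
  print(frame)                                          # all three slots filled in one turn
  missing = [s for s, v in frame.items() if v is None]  # the DM only asks about these next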

11
New cards

Model-Based Approach

Adds user model + context model.
Robot uses past info to choose next action.
More adaptive but complex.

12
New cards

Statistical (ML) Approaches

DM learned from data, not rules.
More adaptable.

13
New cards

Corpus Creation (definition)

Collecting human–human or human–robot dialogue data to train ML models.

14
New cards

Example-Based Approach (definition)

Finds the most similar past dialogue and reuses its answer.
Similarity is measured with cosine similarity (next card).

15
New cards

Cosine Similarity (definition)

Measures how similar two text vectors are (the cosine of the angle between them: 1 = same direction, 0 = unrelated). Example vectors, worked through in the sketch below:

  • “hello” → [0.2, 0.1, 0.7]

  • “hi” → [0.21, 0.09, 0.69]
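
A worked sketch using the two example vectors above (the retrieval step is an illustration of the example-based idea, not the lecture's exact algorithm):

  import math

  def cosine_similarity(a, b):
      dot = sum(x * y for x, y in zip(a, b))
      norm_a = math.sqrt(sum(x * x for x in a))
      norm_b = math.sqrt(sum(x * x for x in b))
      return dot / (norm_a * norm_b)

  hello = [0.2, 0.1, 0.7]
  hi = [0.21, 0.09, 0.69]
  print(cosine_similarity(hello, hi))   # ≈ 0.9999 → "hello" and "hi" are nearly identical

  # Example-based DM: pick the stored dialogue whose vector is closest to the new input.
  corpus = {"greeting": hello, "farewell": [0.7, 0.2, 0.1]}
  best = max(corpus, key=lambda k: cosine_similarity(corpus[k], hi))
  print(best)                           # → "greeting"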

16
New cards

MDP-Based DM (definition)

Uses a Markov Decision Process (states S, actions A, transition function T, reward function R) plus Reinforcement Learning: the system learns which responses are optimal from rewards.
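
A condensed Q-learning sketch on a toy dialogue MDP (states, actions, transitions, and rewards are all invented for illustration):

  import random

  STATES = ["need_info", "have_info", "done"]     # dialogue situations
  ACTIONS = ["ask", "confirm"]                    # possible system moves

  def simulate(state, action):
      """Hand-coded transition T and reward R for the toy example."""
      if state == "need_info" and action == "ask":
          return "have_info", 0.0
      if state == "have_info" and action == "confirm":
          return "done", 1.0                      # task completed → positive reward
      return state, -0.1                          # wasted turn → small penalty

  Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
  alpha, gamma = 0.5, 0.9

  for _ in range(500):                            # learn from many simulated dialogues
      state = "need_info"
      while state != "done":
          action = random.choice(ACTIONS)
          nxt, reward = simulate(state, action)
          best_next = max(Q[(nxt, a)] for a in ACTIONS)
          Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
          state = nxt

  print(max(ACTIONS, key=lambda a: Q[("need_info", a)]))   # → "ask"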

17
New cards

POMDP (definition)

Partially Observable MDP — used when the robot cannot see the full conversation state (e.g., uncertain user intent).
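
Because the true user intent is hidden, a POMDP-based DM keeps a belief (a probability distribution over intents) and updates it from noisy observations. A minimal belief-update sketch (intents and probabilities are made up):

  # Belief over hidden user intents; the observation is noisy, so the system
  # only reweights the belief instead of committing to a single intent.
  belief = {"order_coffee": 0.5, "ask_directions": 0.5}

  # Assumed observation model: P(observed keyword | true intent).
  P_obs = {
      "coffee": {"order_coffee": 0.8, "ask_directions": 0.1},
      "where":  {"order_coffee": 0.1, "ask_directions": 0.7},
  }

  def update(belief, observation):
      new = {intent: P_obs[observation][intent] * p for intent, p in belief.items()}
      total = sum(new.values())
      return {intent: p / total for intent, p in new.items()}

  belief = update(belief, "coffee")
  print(belief)   # belief now strongly favours "order_coffee" (≈ 0.89)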

18
New cards

End-to-End Neural DM (definition)

Neural networks that generate responses word-by-word (e.g., LLMs).
Pros: flexible.
Cons: no control, needs lots of data.

19
New cards

Hybrid DM Approach (definition)

Combines handcrafted structure + ML learning.
Balance between control and adaptability.

20
New cards

When to use handcrafted DM?

When you need control, safety, or have little data.

21
New cards

When to use ML-based DM?

When you have a lot of data and need flexibility.

22
New cards

When to use hybrid DM?

When you want control + some learning, or have medium data.

23
New cards

Evaluation — Subjective metrics

User satisfaction, naturalness, trust, Godspeed questionnaire, SSI.

24
New cards

Evaluation — Objective metrics

Task success rate, # dialogue turns, completion time.

25
New cards

Conversational Analysis — Turn-taking

Managing who speaks when.

26
New cards

Adjacency Pairs (definition & examples)

Paired social actions: greeting–greeting, question–answer, offer–accept/deny.

27
New cards

Sequence Organization (definition)

How conversations are structured logically.

28
New cards

Repair (in CA) (definition)

Fixing breakdowns in communication.

29
New cards

Common Ground (definition)

Shared understanding built up during conversation.

30
New cards

Conversational UX Design (definition)

Using conversational patterns to design natural, reusable dialogues.

31
New cards

IECR Framework (Intent–Entity–Context Recognition)

Intent = what user wants
Entity = details
Context = situation information

Example: “I want vegetarian pasta with onions.”

Intent = AddPreferences
Entities = vegetarian, onions
Context = meal planning
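
The same parse written as a simple data structure (field names are only an illustration of the IECR idea):

  # Hypothetical representation of the parsed utterance from the example above.
  parse = {
      "utterance": "I want vegetarian pasta with onions.",
      "intent": "AddPreferences",
      "entities": ["vegetarian", "onions"],
      "context": "meal planning",
  }

  # The DM acts on intent + entities, while the context scopes the interpretation.
  if parse["intent"] == "AddPreferences":
      preferences = set(parse["entities"])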

32
New cards

Novelty Effect (definition)

People like robots more at first simply because they’re new.

33
New cards

Robots in the Wild (definition)

Testing robots in real-life settings to see true user behavior when novelty fades.

34
New cards

Capability Communication (definition)

Robot explaining what it can and cannot do to reduce user uncertainty.

35
New cards

Why capability communication matters

Builds trust, reduces confusion, aligns expectations.

36
New cards

Communication styles (list 3)

  • Baseline: only repeats when confused

  • Reactive: explains when issues occur

  • Proactive: introduces capabilities before user needs them