Miscommunication Detection and Recovery in Situated Human–Robot Dialogue

Bibliographic Details
Published in: ACM Transactions on Interactive Intelligent Systems, 2019-02, Vol. 9 (1), pp. 1-40
Main authors: Marge, Matthew; Rudnicky, Alexander I.
Format: Article
Language: English
Online access: Full text
Description
Abstract: Even without speech recognition errors, robots may face difficulties interpreting natural-language instructions. We present a method for robustly handling miscommunication between people and robots in task-oriented spoken dialogue. This capability is implemented in TeamTalk, a conversational interface to robots that supports detection and recovery from the situated grounding problems of referential ambiguity and impossible actions. We introduce a representation that detects these problems and a nearest-neighbor learning algorithm that selects recovery strategies for a virtual robot. When the robot encounters a grounding problem, it looks back on its interaction history to consider how it resolved similar situations. The learning method is trained initially on crowdsourced data but is then supplemented by interactions from a longitudinal user study in which six participants performed navigation tasks with the robot. We compare results collected using a general model to user-specific models and find that user-specific models perform best on measures of dialogue efficiency, while the general model yields the highest agreement with human judges. Our overall contribution is a novel approach to detecting and recovering from miscommunication in dialogue by including situated context, namely, information from a robot’s path planner and surroundings.
ISSN: 2160-6455, 2160-6463
DOI: 10.1145/3237189
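
The abstract describes recovery-strategy selection as a nearest-neighbor lookup over the robot's interaction history, conditioned on situated context such as the path planner's output and the set of candidate referents. The sketch below is a minimal, hypothetical illustration of that idea, not the TeamTalk implementation: the feature encoding, strategy names, and class names (Episode, RecoverySelector) are assumptions made for illustration only.

```python
# Minimal sketch (assumed, not the authors' code) of nearest-neighbor
# selection of a recovery strategy from an interaction history.
from dataclasses import dataclass, field
from math import dist

@dataclass
class Episode:
    # Hypothetical situated-context features, e.g. number of candidate
    # referents and whether the path planner found a feasible route.
    features: tuple[float, ...]
    strategy: str    # recovery strategy used, e.g. "ask_to_disambiguate"
    succeeded: bool  # whether that recovery resolved the grounding problem

@dataclass
class RecoverySelector:
    history: list[Episode] = field(default_factory=list)

    def record(self, features: tuple[float, ...], strategy: str, succeeded: bool) -> None:
        """Store one resolved grounding problem in the interaction history."""
        self.history.append(Episode(features, strategy, succeeded))

    def select(self, features: tuple[float, ...], default: str = "ask_to_disambiguate") -> str:
        """Return the strategy from the most similar past successful episode."""
        successes = [e for e in self.history if e.succeeded]
        if not successes:
            return default
        nearest = min(successes, key=lambda e: dist(e.features, features))
        return nearest.strategy

# Usage: a referentially ambiguous instruction (two candidate doors, path feasible)
# is resolved by the strategy that worked in the closest past situation.
selector = RecoverySelector()
selector.record(features=(2.0, 1.0), strategy="list_candidate_referents", succeeded=True)
selector.record(features=(0.0, 0.0), strategy="report_impossible_action", succeeded=True)
print(selector.select(features=(3.0, 1.0)))  # -> "list_candidate_referents"
```

Restricting the lookup to past episodes that succeeded is one simple way to bias the selector toward recoveries that previously worked; in the paper's terms, populating the history from pooled crowdsourced data versus a single user's interactions corresponds roughly to the general and user-specific models compared in the study.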