Touch-text answer for human-robot interaction via supervised adversarial learning

Bibliographic details
Published in: Expert Systems with Applications, 2024-05, Vol. 242, p. 122738, Article 122738
Authors: Wang, Ya-Xin; Meng, Qing-Hao; Li, Yun-Kai; Hou, Hui-Rang
Format: Article
Language: English
Online access: Full text
Abstract: In daily life, the touch modality plays an important role in conveying human intentions and emotions. To further improve touch-based human-robot interaction, robots need to infer human emotions from touch signals and respond accordingly. Correlating the emotional state conveyed by touch gestures with appropriate text responses is therefore a major challenge. At present, there is little research on touch-text dialogue, and robots cannot respond to human tactile gestures with appropriate text, so touch-text-based human-robot interaction is not yet possible. To address these problems, we first built a touch-text dialogue (TTD) corpus based on six basic emotions through experiments, containing 1109 touch-text sample pairs. We then designed a supervised adversarial learning for touch-text answer (SATTA) model to realize touch-text-based human-robot interaction. The SATTA model correlates data from the text modality with data from the touch modality by reducing the emotion discrimination loss in the common space and the feature difference between sample pairs of the two modalities. At the same time, the feature representations are mapped into the label space to reduce the classification loss of samples. Experiments on the TTD corpus validate the proposed method.
ISSN: 0957-4174, 1873-6793
DOI: 10.1016/j.eswa.2023.122738
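
The abstract above describes three loss terms: an adversarial emotion/modality discrimination loss in a common space, a feature-difference loss between paired touch and text samples, and a classification loss in the label space. The sketch below illustrates how such a combined objective can be wired up in PyTorch; it is not the authors' implementation, and the feature dimensions, network sizes, pair-distance loss, and equal loss weights are all assumptions made for illustration.

```python
# Minimal sketch of a supervised adversarial cross-modal objective,
# assuming pre-extracted touch and text feature vectors. Not the SATTA code.
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Gradient reversal: identity forward, negated gradient backward,
    so minimizing the discriminator loss trains the encoders adversarially."""
    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)
    @staticmethod
    def backward(ctx, grad_output):
        return -grad_output

touch_dim, text_dim, common_dim, num_emotions = 128, 300, 64, 6  # assumed sizes

touch_encoder = nn.Sequential(nn.Linear(touch_dim, common_dim), nn.ReLU())
text_encoder = nn.Sequential(nn.Linear(text_dim, common_dim), nn.ReLU())
modality_discriminator = nn.Linear(common_dim, 2)       # touch vs. text
emotion_classifier = nn.Linear(common_dim, num_emotions)  # label-space mapping

ce = nn.CrossEntropyLoss()
params = (list(touch_encoder.parameters()) + list(text_encoder.parameters())
          + list(modality_discriminator.parameters())
          + list(emotion_classifier.parameters()))
optimizer = torch.optim.Adam(params, lr=1e-3)

def training_step(touch_feat, text_feat, emotion_label):
    """One update on a batch of paired touch/text samples sharing emotion labels."""
    z_touch = touch_encoder(touch_feat)   # project both modalities
    z_text = text_encoder(text_feat)      # into the common space

    # (1) adversarial modality-discrimination loss in the common space
    z_all = GradReverse.apply(torch.cat([z_touch, z_text], dim=0))
    modality = torch.cat([torch.zeros(len(z_touch)), torch.ones(len(z_text))]).long()
    loss_adv = ce(modality_discriminator(z_all), modality)

    # (2) feature difference between paired touch/text samples
    loss_pair = nn.functional.mse_loss(z_touch, z_text)

    # (3) classification loss after mapping features into the label space
    logits = emotion_classifier(torch.cat([z_touch, z_text], dim=0))
    loss_cls = ce(logits, emotion_label.repeat(2))

    loss = loss_adv + loss_pair + loss_cls  # equal weights assumed here
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

As a usage example under these assumptions, a batch of eight paired samples could be processed with `training_step(torch.randn(8, 128), torch.randn(8, 300), torch.randint(0, 6, (8,)))`.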