A Fast-Response Dynamic-Static Parallel Attention GCN Network for Body-Hand Gesture Recognition in HRI
Published in: IEEE Transactions on Industrial Electronics (1982), 2024-06, Vol. 71 (6), pp. 1-12
Main Authors: , , , , , ,
Format: Article
Language: English
Abstract: Human-robot interaction (HRI) systems are crucial in robotics; naturalness, fast response, and multimodality are the future trends in their development. However, current interaction methods have the following flaws: 1) slow response of action recognition algorithms in generic scenes, especially at the beginning stage of an action; 2) insufficient feature extraction and fusion capabilities for spatiotemporal graph data; and 3) no established paradigm for body-hand recognition in HRI. To overcome these bottlenecks, we propose a fast-response graph convolutional network (GCN) for body-hand gesture recognition. First, we propose a dynamic-static parallel network for dynamic body gestures that is both responsive and accurate. Second, we propose a spatiotemporal graph attention module to improve the fusion of graph data in the dynamic-static network. Third, we implement a complete command module that combines body and hand information into complete commands for interacting with and controlling the robot. Finally, extensive experiments on four datasets, together with real-world experiments, demonstrate that our network recognizes dynamic body gestures quickly and accurately even at their beginning stage, verifying the effectiveness of skeleton-based body-hand gesture recognition, with a clear advantage over the state of the art.
ISSN: 0278-0046; 1557-9948
DOI: 10.1109/TIE.2023.3299012
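
The abstract describes a spatiotemporal graph attention module for fusing skeleton graph features, but the record gives no implementation details. Below is a minimal, hypothetical PyTorch sketch of what such a block could look like, assuming skeleton input of shape (batch, channels, frames, joints) and a fixed joint adjacency matrix; the module name `SpatioTemporalGraphAttention` and all parameter names are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch (not the paper's implementation): a spatiotemporal graph
# attention block over skeleton data of shape (batch, channels, frames, joints).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpatioTemporalGraphAttention(nn.Module):
    def __init__(self, in_channels, out_channels, num_joints, kernel_t=9):
        super().__init__()
        # Learnable additive refinement of the skeleton adjacency (spatial attention).
        self.adj_residual = nn.Parameter(torch.zeros(num_joints, num_joints))
        self.spatial_conv = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        # Temporal convolution over the frame axis.
        self.temporal_conv = nn.Conv2d(
            out_channels, out_channels,
            kernel_size=(kernel_t, 1), padding=(kernel_t // 2, 0))
        # Channel attention pooled over frames and joints (squeeze-and-excitation style).
        self.channel_fc = nn.Sequential(
            nn.Linear(out_channels, out_channels // 4),
            nn.ReLU(inplace=True),
            nn.Linear(out_channels // 4, out_channels),
            nn.Sigmoid())
        self.bn = nn.BatchNorm2d(out_channels)

    def forward(self, x, adj):
        # x: (N, C, T, V); adj: (V, V) normalized skeleton adjacency.
        a = adj + self.adj_residual                   # attended adjacency
        y = torch.einsum('nctv,vw->nctw', x, a)       # spatial graph aggregation
        y = self.spatial_conv(y)
        y = self.temporal_conv(y)
        # Channel attention weights from global average pooling over frames and joints.
        w = self.channel_fc(y.mean(dim=(2, 3)))       # (N, C_out)
        y = y * w.unsqueeze(-1).unsqueeze(-1)
        return F.relu(self.bn(y))


if __name__ == "__main__":
    N, C, T, V = 2, 3, 32, 25      # batch, channels (x, y, z), frames, joints
    x = torch.randn(N, C, T, V)
    adj = torch.eye(V)             # placeholder adjacency; normally built from the skeleton
    block = SpatioTemporalGraphAttention(C, 64, num_joints=V)
    print(block(x, adj).shape)     # torch.Size([2, 64, 32, 25])
```

In this sketch, spatial attention is realized as a learnable residual added to the adjacency matrix and channel attention as squeeze-and-excitation-style gating; the paper's actual module may differ in both structure and detail.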