Hand Gesture Recognition From Wrist-Worn Camera for Human-Machine Interaction


Bibliographic Details
Published in: IEEE Access, 2023, Vol. 11, pp. 53262-53274
Main authors: Nguyen, Hong-Quan, Le, Trung-Hieu, Tran, Trung-Kien, Tran, Hoang-Nhat, Tran, Thanh-Hai, Le, Thi-Lan, Vu, Hai, Pham, Cuong, Nguyen, Thanh Phuong, Nguyen, Huu Thanh
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Summary: In this work, we study the use of hand gestures captured by wrist-worn sensors for human-machine interaction. Towards this goal, we design a wrist-worn prototype that captures an RGB video stream of hand gestures. We then build a new wrist-worn gesture dataset (named WiGes) with various subjects interacting with home appliances in different environments. To the best of our knowledge, this is the first benchmark released for studying hand gestures from a wrist-worn camera. We evaluate various CNN models for vision-based recognition and analyze in depth the models that offer the best trade-off between accuracy, memory requirement, and computational cost. We find that, among the studied architectures, MoviNet produces the highest accuracy. We then introduce a new MoviNet-based two-stream architecture that takes both RGB and optical flow into account. Our proposed architecture increases Top-1 accuracy by 1.36% and 3.67% under two evaluation protocols. Our dataset, baselines, and proposed model analysis provide instructive recommendations for human-machine interaction using hand-held devices.
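The two-stream idea summarized above can be sketched as late fusion of per-class scores from an RGB stream and an optical-flow stream. The abstract does not state the fusion rule or class vocabulary, so the weighted averaging, function names, and example logits below are illustrative assumptions, not the authors' implementation:

```python
# Illustrative sketch of two-stream late fusion (hypothetical, stdlib only):
# each stream (RGB, optical flow) yields per-class logits; we convert them
# to probabilities and average them to pick the gesture class.
import math


def softmax(logits):
    """Convert raw logits to a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]


def fuse_two_streams(rgb_logits, flow_logits, w_rgb=0.5):
    """Weighted average of the two streams' class probabilities."""
    p_rgb = softmax(rgb_logits)
    p_flow = softmax(flow_logits)
    return [w_rgb * a + (1.0 - w_rgb) * b for a, b in zip(p_rgb, p_flow)]


def predict(rgb_logits, flow_logits):
    """Return the index of the most probable gesture class after fusion."""
    fused = fuse_two_streams(rgb_logits, flow_logits)
    return max(range(len(fused)), key=fused.__getitem__)


# Hypothetical logits for a 4-gesture vocabulary: the RGB stream favors
# class 0, the flow stream favors class 1; fusion resolves the conflict.
rgb = [2.0, 0.5, 0.1, -1.0]
flow = [0.3, 1.8, 0.2, -0.5]
print(predict(rgb, flow))
```

In practice each stream would be a full MoviNet backbone producing the logits; the point of the sketch is only that the fusion step itself is cheap relative to the backbones.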
ISSN: 2169-3536
DOI:10.1109/ACCESS.2023.3279845