A novel muscle-computer interface for hand gesture recognition using depth vision

Bibliographic Details
Published in: Journal of Ambient Intelligence and Humanized Computing, 2020-11, Vol. 11 (11), p. 5569-5580
Main Authors: Zhou, Xuanyi, Qi, Wen, Ovur, Salih Ertug, Zhang, Longbin, Hu, Yingbai, Su, Hang, Ferrigno, Giancarlo, De Momi, Elena
Format: Article
Language: English
Online Access: Full text
Description
Abstract: The muscle-computer interface (muCI), one of the most widespread human-computer interfaces, has been widely adopted for identifying hand gestures from the electrical activity of muscles. Although multi-modal theory and machine learning algorithms have brought enormous progress to muCI over the last decades, collecting and labeling large data sets creates a high workload and leads to time-consuming implementations. In this paper, a novel muCI was developed that integrates the advantages of EMG signals and depth vision, using depth vision to automatically label clusters of collected EMG data. A three-layer hierarchical k-medoids approach was designed to extract and label the clustering features of ten hand gestures, and a multi-class linear discriminant analysis algorithm was applied to build the hand gesture classifier. The results showed that the proposed algorithm achieved high accuracy and that the muCI performed well, automatically labeling the hand gestures in all experiments. The proposed muCI can be used for hand gesture recognition without labeling data in advance and has potential for robot manipulation and virtual reality applications.
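The abstract names two building blocks (k-medoids clustering to pseudo-label unlabeled EMG features, then multi-class LDA trained on those labels) without detailing the authors' three-layer hierarchical design or depth-vision matching step. The Python sketch below only illustrates this two-stage idea on synthetic data: the k_medoids function is a plain PAM-style stand-in, the Gaussian feature generator replaces real windowed EMG features, and all names and parameter values are hypothetical rather than the paper's implementation.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

def k_medoids(X, k, n_iter=50):
    # Plain PAM-style k-medoids on Euclidean distances (a stand-in for the
    # paper's three-layer hierarchical variant, which the abstract does not detail).
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    medoids = rng.choice(len(X), size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)  # assign each point to nearest medoid
        new_medoids = medoids.copy()
        for j in range(k):
            members = np.flatnonzero(labels == j)
            # swap medoid j to the member minimizing total intra-cluster distance
            new_medoids[j] = members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return np.argmin(D[:, medoids], axis=1)

# Synthetic stand-in for windowed EMG feature vectors of ten gestures.
n_gestures, per_class, dim = 10, 60, 8
centers = rng.normal(scale=5.0, size=(n_gestures, dim))
X = np.concatenate([c + rng.normal(scale=0.8, size=(per_class, dim)) for c in centers])
hidden_gesture = np.repeat(np.arange(n_gestures), per_class)  # for evaluation only

# Step 1: cluster the unlabeled EMG features; in the paper, each cluster id would
# then be mapped to the gesture observed concurrently by the depth camera.
pseudo_labels = k_medoids(X, k=n_gestures)

# Step 2: train a multi-class LDA classifier on the automatically labeled data.
clf = LinearDiscriminantAnalysis().fit(X, pseudo_labels)

purity = sum(np.bincount(hidden_gesture[pseudo_labels == j]).max()
             for j in range(n_gestures)) / len(X)
print(f"cluster purity vs. hidden gesture ids: {purity:.2f}")
print(f"LDA accuracy on pseudo-labels: {clf.score(X, pseudo_labels):.2f}")

On well-separated feature clusters, the pseudo-labels recover the underlying gesture identities and LDA fits them almost perfectly; with real EMG, feature overlap between gestures would make the depth-vision labeling step correspondingly more important.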
ISSN: 1868-5137, 1868-5145
DOI: 10.1007/s12652-020-01913-3