The LET Procedure for Gesture Recognition With Multiple Forearm Angles


Detailed Description

Bibliographic Details
Published in: IEEE Sensors Journal, 2022-07, Vol. 22 (13), p. 13226-13233
Main Authors: Cai, Shaoxiong; Lu, Zongxing; Guo, Lin; Qing, Zengyu; Yao, Ligang
Format: Article
Language: English
Description
Abstract: A-mode ultrasound, like other biological signals, exhibits deviations when the same gesture is performed at different arm positions. This problem hinders the clinical application of gesture recognition based on A-mode ultrasound. To tackle it, we propose the linearly enhanced training (LET) procedure, which compensates for the deviation of gesture signals after the forearm position changes. The training set does not contain gesture data from the new position, so no additional training is required. Instead, we determine scale parameters that construct enhanced features for the new positions from the original-position gesture features. We tested the method on 10 gestures after the forearm angle was changed. Results show that classification accuracy improves by 7.8% and 9.4% after the forearm is bent and stretched by 40°, respectively. Since the LET procedure is a step between feature extraction and model construction, it is compatible with various features and algorithms, offering a multi-scene solution based on wearable A-mode ultrasound.
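Since this record contains only the abstract, the following is a minimal sketch of the idea it describes: linearly scaling original-position gesture features to build enhanced training features for new forearm positions, inserted between feature extraction and classifier training. The function name, the per-angle scalar "scale parameters", and the data shapes are all assumptions for illustration, not the authors' published implementation.

```python
import numpy as np

def let_augment(features, scales):
    """Hypothetical LET-style augmentation: stack the original-position
    feature matrix with linearly scaled copies that emulate the signal
    deviation at new forearm angles."""
    augmented = [features]
    for s in scales:
        # element-wise linear enhancement with an assumed scale parameter
        augmented.append(features * s)
    return np.vstack(augmented)

# 4 samples x 3 features recorded at the original forearm position
X = np.ones((4, 3))
# two assumed scale parameters, e.g. for bent and stretched positions
X_aug = let_augment(X, [0.9, 1.1])
print(X_aug.shape)  # (12, 3)
```

The classifier would then be trained on `X_aug` (with the gesture labels repeated accordingly), so no recordings at the new forearm angles are needed.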
ISSN: 1530-437X, 1558-1748
DOI: 10.1109/JSEN.2022.3177475