Deep Learning-Based Upper Limb Functional Assessment Using a Single Kinect v2 Sensor

Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2020-03, Vol. 20 (7), p. 1903
Main Authors: Ma, Ye; Liu, Dongwei; Cai, Laisi
Format: Article
Language: English
Online Access: Full text
Abstract: We develop a deep-learning-refined kinematic model for accurately assessing upper limb joint angles using a single Kinect v2 sensor. We train a long short-term memory (LSTM) recurrent neural network in a supervised fashion to compensate for the systematic error of the Kinect kinematic model, taking a marker-based three-dimensional motion capture system (3DMC) as the gold standard. We conducted a series of upper limb functional task experiments: hand to the contralateral shoulder, hand to mouth (drinking), combing hair, and hand to back pocket. Our deep learning-based model significantly improves the performance of a single Kinect v2 sensor for all investigated upper limb joint angles across all functional tasks. Using a single Kinect v2 sensor, our model measures shoulder and elbow flexion/extension waveforms with mean coefficients of multiple correlation (CMCs) >0.93 for all tasks, and shoulder adduction/abduction and internal/external rotation waveforms with mean CMCs >0.8 for most tasks. The mean deviations of the angles at the point of target achieved and of the range of motion are under 5° for all investigated joint angles during all functional tasks. Compared with the 3DMC, the presented system is easier to operate and requires less laboratory space.
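The abstract does not include an implementation. As a rough illustration of the described approach (a supervised LSTM that refines Kinect v2 joint-angle sequences toward marker-based 3DMC reference angles), here is a minimal sketch in PyTorch. Everything below is an assumption made for illustration: the number of joint angles, the hidden size, the residual-correction formulation, and the training hyperparameters are not taken from the paper.

```python
import torch
import torch.nn as nn

N_ANGLES = 4  # illustrative: shoulder flex/ext, add/abd, int/ext rotation, elbow flex/ext

class KinectRefiner(nn.Module):
    """Sketch of an LSTM that maps Kinect v2 joint-angle sequences toward
    3DMC reference angles (not the authors' code)."""
    def __init__(self, n_angles: int = N_ANGLES, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_angles, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_angles)

    def forward(self, x):            # x: (batch, frames, n_angles) Kinect angles
        h, _ = self.lstm(x)
        return x + self.head(h)      # assumed residual correction toward 3DMC angles

model = KinectRefiner()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One training step on random stand-in tensors; real training would use
# time-synchronized Kinect/3DMC angle waveforms from the functional tasks.
kinect_angles = torch.randn(8, 100, N_ANGLES)
mocap_angles = kinect_angles + 0.1 * torch.randn(8, 100, N_ANGLES)
optimizer.zero_grad()
loss = loss_fn(model(kinect_angles), mocap_angles)
loss.backward()
optimizer.step()
```

The abstract reports waveform similarity as mean CMCs (coefficients of multiple correlation). The exact CMC variant the authors used is not stated in the abstract; the Kadaba-style form below is one common choice and is included only as an illustrative assumption.

```python
import numpy as np

def cmc(waves: np.ndarray) -> float:
    """Coefficient of multiple correlation for G waveforms of length T
    (rows = waveforms), in the common Kadaba-style form; values near 1
    indicate highly similar waveform shapes."""
    G, T = waves.shape
    frame_mean = waves.mean(axis=0)   # mean curve across the G waveforms
    grand_mean = waves.mean()
    within = ((waves - frame_mean) ** 2).sum() / (T * (G - 1))
    total = ((waves - grand_mean) ** 2).sum() / (G * T - 1)
    return float(np.sqrt(1.0 - within / total))
```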
ISSN: 1424-8220
DOI: 10.3390/s20071903