Heterogeneous computing model for post‐injury walking pattern restoration and postural stability rehabilitation exercise recognition
Saved in:
Published in: | Expert systems 2022-07, Vol.39 (6), p.n/a |
---|---|
Main authors: | , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
Abstract: | The research paper presents a heterogeneous computing model for the analysis and restoration of human walking deformity and postural instability. Gait‐related walking activities are very important for the analysis of postural instability, the correction of gait abnormalities, the diagnosis of cognitive decline, the enhancement of the cognitive ability of human‐centred humanoid robot systems, and many early‐stage clinical diagnoses, for example, Parkinson's disease, pathological gait, and freezing of gait. For the experimental analysis, 10 different lower‐limb activities of healthy and crouch‐walking subjects are considered. A total of 25 healthy and 10 crouch‐walking subjects of different age groups, sexes, and mental statuses are included in the experiments. To achieve this objective, the patterns of the 10 rehabilitation activities are captured using an RGB‐Depth (RGB‐D) camera and classified using heterogeneous deep learning models. Two deep learning models, a Convolutional Neural Network (CNN) and a CNN‐LSTM (CNN with Long Short‐Term Memory), are used for the classification of these rehabilitation exercises. The RGB‐D data is obtained using a Microsoft Kinect v2 sensor at a 100 Hz sampling frequency. Experimental results show activity recognition accuracies of 96% and 98% for the CNN and CNN‐LSTM models, respectively. |
---|---|
ISSN: | 0266-4720 1468-0394 |
DOI: | 10.1111/exsy.12706 |
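The CNN‐LSTM pipeline described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the window length (2 s at 100 Hz), the 25-joint 3D skeleton layout of the Kinect v2, and all layer sizes are assumptions, and the weights are random (untrained), so the example only demonstrates the data flow from a joint-coordinate sequence through a 1D convolution and an LSTM to a 10-way activity probability vector.

```python
import numpy as np

# Assumed dimensions (not from the paper): Kinect v2 tracks 25 joints in 3D
# at 100 Hz, so a 2 s window is 200 frames of 75 features; the paper
# classifies 10 rehabilitation activities.
N_JOINTS, N_COORDS, WIN, N_CLASSES = 25, 3, 200, 10
N_FEAT = N_JOINTS * N_COORDS          # 75 features per frame
K, C_CONV, H_LSTM = 9, 32, 16         # kernel size, conv channels, LSTM units

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def conv1d_relu(x, w, b):
    """Valid 1D convolution over time + ReLU. x: (T, C_in), w: (K, C_in, C_out)."""
    t_out = x.shape[0] - w.shape[0] + 1
    out = np.empty((t_out, w.shape[2]))
    for t in range(t_out):
        out[t] = np.tensordot(x[t:t + w.shape[0]], w, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)

def lstm_last_hidden(x, wx, wh, b):
    """Plain LSTM over the feature sequence; returns the final hidden state."""
    h = np.zeros(H_LSTM)
    c = np.zeros(H_LSTM)
    for t in range(x.shape[0]):
        i, f, g, o = np.split(x[t] @ wx + h @ wh + b, 4)  # gate pre-activations
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)      # cell-state update
        h = sigmoid(o) * np.tanh(c)                       # hidden-state update
    return h

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Randomly initialised parameters (stand-ins for trained weights).
w_conv = rng.normal(0.0, 0.1, (K, N_FEAT, C_CONV))
b_conv = np.zeros(C_CONV)
wx = rng.normal(0.0, 0.1, (C_CONV, 4 * H_LSTM))
wh = rng.normal(0.0, 0.1, (H_LSTM, 4 * H_LSTM))
b_lstm = np.zeros(4 * H_LSTM)
w_out = rng.normal(0.0, 0.1, (H_LSTM, N_CLASSES))
b_out = np.zeros(N_CLASSES)

def classify(window):
    """window: (WIN, N_FEAT) joint-coordinate sequence -> class probabilities."""
    feat = conv1d_relu(window, w_conv, b_conv)   # CNN: local motion features
    h = lstm_last_hidden(feat, wx, wh, b_lstm)   # LSTM: temporal summary
    return softmax(h @ w_out + b_out)            # 10-way activity probabilities

window = rng.normal(size=(WIN, N_FEAT))          # one synthetic 2 s recording
probs = classify(window)
```

The plain CNN variant from the abstract would replace the LSTM stage with pooling over time; the CNN‐LSTM variant shown here is the one that keeps explicit temporal ordering, which plausibly explains its higher reported accuracy (98% vs. 96%).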