Sensor Fusion and Machine Learning for Seated Movement Detection With Trunk Orthosis
Published in: IEEE Access, 2024, Vol. 12, pp. 41676-41687
Main authors: , , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Advanced assistive devices developed for activities of daily living use machine learning (ML) for motion intention detection with wearable sensors. Trunk assistive devices provide safety, balance, and independence for wheelchair users, individuals who spend prolonged hours in a sitting position. We used ML for trunk movement intention detection with a trunk orthosis. A sensor fusion technique combining signals from four electromyography (EMG) sensors and one inertial measurement unit (IMU) was used to develop a three-level classification system. Forty participants performed seated trunk movement trials while wearing the orthosis. The trials comprised 30 movements involving trunk flexion/extension, lateral bending, and axial rotation. A wrapper method was used to select the essential EMG features. Ensemble (ES), k-nearest neighbors (KNN), and support vector machine (SVM) classifiers were evaluated. Twenty-six features (five EMG features for each of the four muscles and six IMU features) were used to develop ten individual ML models, yielding an average accuracy of 95.44%. Eight models achieved their highest accuracy with the ES classifier and two with KNN. The models were then cascaded to form a trunk motion detection system that achieved a test accuracy of 87.0%. These promising results can be applied to trunk motion recognition with an active trunk orthosis.
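The wrapper-based feature selection mentioned in the abstract can be illustrated with a minimal sketch. The use of scikit-learn's SequentialFeatureSelector as the wrapper, the KNN estimator inside it, and all data shapes are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of wrapper-based EMG feature selection.
# A wrapper method scores candidate feature subsets by the accuracy of a
# wrapped classifier; scikit-learn's SequentialFeatureSelector is one such
# wrapper. Data, estimator, and feature counts here are illustrative only.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X_emg = rng.normal(size=(600, 40))   # hypothetical candidate EMG features
y = rng.integers(0, 3, size=600)     # hypothetical movement-class labels

# Forward selection keeps 20 EMG features, matching the abstract's
# "five EMG features for each of the four muscles".
selector = SequentialFeatureSelector(
    KNeighborsClassifier(n_neighbors=5),
    n_features_to_select=20,
    direction="forward",
)
selector.fit(X_emg, y)
X_selected = selector.transform(X_emg)  # reduced feature matrix
print(X_selected.shape)                  # (600, 20)
```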
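Likewise, the cascading of individual models into a movement detection system can be sketched, here simplified to two stages (movement detection, then movement type). The stage structure, classifiers, and labels are assumptions; the paper's full system cascades ten models across three classification levels.

```python
# Hypothetical sketch of a cascaded (hierarchical) trunk-movement classifier.
# Feature layout follows the abstract's 26 features (5 EMG x 4 muscles + 6 IMU);
# everything else (data, labels, classifier choices) is illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 26))           # 26-dimensional fused feature vectors
y_move = rng.integers(0, 2, size=600)    # stage 1: rest (0) vs. movement (1)
y_class = rng.integers(0, 3, size=600)   # stage 2: movement type

# Stage 1: an ensemble classifier decides whether any trunk movement occurs.
stage1 = RandomForestClassifier(n_estimators=100).fit(X, y_move)

# Stage 2: trained only on movement samples, identifies the movement type.
moving = y_move == 1
stage2 = KNeighborsClassifier(n_neighbors=5).fit(X[moving], y_class[moving])

def cascade_predict(x):
    """Run one fused feature vector through the two-stage cascade."""
    x = x.reshape(1, -1)
    if stage1.predict(x)[0] == 0:
        return "rest"
    types = ["flexion/extension", "lateral bending", "axial rotation"]
    return types[stage2.predict(x)[0]]

print(cascade_predict(X[0]))
```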
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3377111