Human activity recognition adapted to the type of movement
Published in: Computers & Electrical Engineering, 2020-12, Vol. 88, p. 106822, Article 106822
Format: Article
Language: English
Online access: Full text
Highlights:
• An analysis of wearable sensor data shows relevant differences between movements.
• A novel human movement classification based on motion characteristics is proposed.
• Input-data formats and window sizes are evaluated depending on the type of activity.
• The most appropriate signal processing is used for each type of movement.
• The most appropriate deep learning architecture is used for each type of movement.
This paper analyzes the main motion characteristics of several types of movement using wearable sensor data. Based on this analysis, different deep learning-based strategies were evaluated to select the best alternative for separating movements within the same type. These strategies included Convolutional and Recurrent Neural Networks for feature learning and classification. Recordings from the Pamap2 and Opportunity datasets were used to evaluate the alternatives under a subject-wise cross-validation. Significant differences were obtained between the alternatives for the different types of movement, demonstrating the need to adapt the feature extraction and classification modules to each type of movement. The best results on the Pamap2 dataset showed accuracies of 96.7% and 88.7% for repetitive movements and postures, respectively. On the Opportunity dataset, the best results reported accuracies of 66.9% and 97.1% for non-repetitive movements and postures, respectively.
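The abstract refers to two concrete evaluation choices: segmenting sensor streams into fixed-size windows and validating subject-wise, so that no subject contributes windows to both the training and test sets. A minimal sketch of both ideas is shown below; the window length, step, and toy data are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

def sliding_windows(signal, win, step):
    """Segment a 1-D sensor stream into fixed-length, overlapping windows."""
    return np.array([signal[i:i + win]
                     for i in range(0, len(signal) - win + 1, step)])

def subject_wise_folds(subject_ids):
    """Leave-one-subject-out folds: each fold holds out all windows
    belonging to one subject, so train and test subjects never overlap."""
    for held_out in np.unique(subject_ids):
        test_mask = subject_ids == held_out
        yield np.where(~test_mask)[0], np.where(test_mask)[0]

# Toy data (assumed): 3 subjects, 100-sample streams,
# 20-sample windows with 50% overlap.
windows, subjects = [], []
for sid in range(3):
    stream = np.random.default_rng(sid).normal(size=100)
    w = sliding_windows(stream, win=20, step=10)
    windows.append(w)
    subjects.extend([sid] * len(w))
windows = np.concatenate(windows)
subjects = np.array(subjects)

for train_idx, test_idx in subject_wise_folds(subjects):
    # Subject-wise property: no subject appears on both sides of the split.
    assert not set(subjects[train_idx]) & set(subjects[test_idx])
```

In a subject-wise protocol like this, the classifier (a CNN or RNN in the paper's strategies) would be trained on the windows in `train_idx` and scored on `test_idx` for each fold.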
ISSN: 0045-7906, 1879-0755
DOI: 10.1016/j.compeleceng.2020.106822