Time Analysis in Human Activity Recognition

Bibliographic Details
Published in: Neural Processing Letters, December 2021, Vol. 53 (6), pp. 4507-4525
Authors: Gil-Martín, Manuel; San-Segundo, Rubén; Fernández-Martínez, Fernando; Ferreiros-López, Javier
Format: Article
Language: English
Online access: Full text
Description
Abstract: Continuous human activity recognition from inertial signals is performed by splitting these temporal signals into time windows and identifying the activity in each window. Defining the appropriate window duration has been the target of several previous works. In most of these analyses, the recognition performance increases with window duration up to an optimal value and then decreases or saturates for longer windows. This paper evaluates several strategies for combining sub-window information inside a window, obtaining significant improvements for long windows. The evaluation was performed with a state-of-the-art human activity recognition system based on Convolutional Neural Networks (CNNs). This deep neural network includes convolutional layers that learn features from signal spectra and additional fully connected layers that classify the activity in each window. All analyses were carried out on two public datasets (PAMAP2 and USC-HAD) with Leave-One-Subject-Out (LOSO) cross-validation. For 10-s windows, accuracy increased from 90.1 (± 0.66) to 94.27 (± 0.46) on PAMAP2 and from 80.54 (± 0.73) to 84.46 (± 0.67) on USC-HAD. For 20-s windows, the improvements were from 92.66 (± 0.58) to 96.35 (± 0.38) (PAMAP2) and from 78.39 (± 0.76) to 86.36 (± 0.57) (USC-HAD).
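
The abstract describes a windowing pipeline in which each long window is split into sub-windows, a CNN classifies the spectrum of every sub-window, and the sub-window outputs are combined into a single decision per window. The sketch below illustrates that idea; it is not the authors' code, and the sampling rate, sub-window length, layer sizes, class count, and the posterior-averaging combination strategy are illustrative assumptions rather than details taken from the paper.

# Minimal sketch (assumed parameters): sub-window spectra -> small CNN -> averaged posteriors per window.
import numpy as np
import torch
import torch.nn as nn

FS = 100            # assumed sampling rate (Hz)
WINDOW_S = 10       # window length in seconds (one of the durations studied)
SUBWIN_S = 2        # assumed sub-window length in seconds
N_CHANNELS = 3      # e.g. one tri-axial accelerometer
N_CLASSES = 12      # assumed number of activity classes

class SubWindowCNN(nn.Module):
    """Small CNN mapping a sub-window magnitude spectrum to class logits."""
    def __init__(self, n_channels: int, n_bins: int, n_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (n_bins // 4), 64),
            nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

def subwindow_spectra(window: np.ndarray, fs: int, subwin_s: int) -> torch.Tensor:
    """Split a (channels, samples) window into sub-windows and return their magnitude spectra."""
    sub_len = fs * subwin_s
    n_sub = window.shape[1] // sub_len
    spectra = []
    for i in range(n_sub):
        segment = window[:, i * sub_len:(i + 1) * sub_len]
        spectra.append(np.abs(np.fft.rfft(segment, axis=1)))
    return torch.tensor(np.stack(spectra), dtype=torch.float32)  # (n_sub, channels, bins)

def classify_window(model: nn.Module, window: np.ndarray) -> int:
    """Combine sub-window information by averaging class posteriors over the window."""
    spectra = subwindow_spectra(window, FS, SUBWIN_S)
    with torch.no_grad():
        probs = torch.softmax(model(spectra), dim=1)   # one posterior per sub-window
    return int(probs.mean(dim=0).argmax())             # average, then pick the activity

# Example with random data standing in for a real 10-s inertial window.
n_bins = FS * SUBWIN_S // 2 + 1
model = SubWindowCNN(N_CHANNELS, n_bins, N_CLASSES)
window = np.random.randn(N_CHANNELS, FS * WINDOW_S)
print("predicted activity id:", classify_window(model, window))

Averaging posteriors is only one plausible way to fuse sub-window information; the paper evaluates several combination strategies, and the specific ones it compares are described in the full text rather than in the abstract.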
ISSN: 1370-4621, 1573-773X
DOI: 10.1007/s11063-021-10611-w