An unsupervised statistical representation learning method for human activity recognition
Published in: Signal, Image and Video Processing, 2024-09, Vol. 18 (10), p. 7041-7052
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: With the evolution of smart devices like smartphones, smartwatches, and other wearable devices, motion sensors have been integrated into these devices to collect data and analyze human activities. Consequently, sensor-based Human Activity Recognition (HAR) has emerged as a significant research area in the fields of ubiquitous computing and wearable computing. This paper presents a novel approach that employs Latent Dirichlet Allocation (LDA) to extract meaningful representations from activity signals. The method involves transforming the activity signal, which is a sequence of samples, into a sequence of discrete symbols using vector quantization. Subsequently, LDA is utilized to embed the symbol sequence into a fixed-length representation vector. Finally, a classifier is employed to classify the obtained representation vector. The effectiveness of the proposed method is evaluated using the UNIMIB-SHAR dataset. Experimental results demonstrate its competitive performance in terms of accuracy and F1-score metrics when compared to existing methods. Moreover, our method boasts a more lightweight architecture and incurs lower computational costs compared to deep learning-based approaches. The findings of this study contribute to the advancement of HAR and hold practical implications for HAR systems.
ISSN: 1863-1703, 1863-1711
DOI: 10.1007/s11760-024-03374-z
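The abstract describes a three-stage pipeline: vector quantization of the raw sensor sequence into discrete symbols, LDA embedding of the symbol sequence into a fixed-length vector, and a conventional classifier on that vector. The sketch below illustrates that kind of pipeline with scikit-learn; the KMeans quantizer, vocabulary size, topic count, bag-of-symbols counting, and SVM classifier are illustrative assumptions, not the settings or implementation reported in the paper.

```python
# Minimal sketch of a quantize -> LDA -> classify pipeline (assumed components).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.svm import SVC

def quantize(signals, kmeans):
    """Map each (window_length, 3) accelerometer window to a sequence of symbol ids."""
    return [kmeans.predict(sig) for sig in signals]

def symbol_histograms(symbol_seqs, vocab_size):
    """Bag-of-symbols counts per window, the usual input format for LDA."""
    H = np.zeros((len(symbol_seqs), vocab_size))
    for i, seq in enumerate(symbol_seqs):
        np.add.at(H[i], seq, 1)
    return H

def fit_pipeline(signals_train, y_train, vocab_size=64, n_topics=32):
    # signals_train: list of (window_length, 3) arrays; y_train: activity labels.
    kmeans = KMeans(n_clusters=vocab_size, n_init=10).fit(np.vstack(signals_train))
    counts = symbol_histograms(quantize(signals_train, kmeans), vocab_size)
    lda = LatentDirichletAllocation(n_components=n_topics).fit(counts)
    clf = SVC().fit(lda.transform(counts), y_train)
    return kmeans, lda, clf

def predict(signals, kmeans, lda, clf, vocab_size=64):
    counts = symbol_histograms(quantize(signals, kmeans), vocab_size)
    return clf.predict(lda.transform(counts))
```

Because LDA operates on symbol counts rather than raw samples, the representation has a fixed length regardless of window size, which is what allows a lightweight classifier to be used downstream.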