Attend And Discriminate: Beyond the State-of-the-Art for Human Activity Recognition using Wearable Sensors
| Main author(s): | , , , , |
|---|---|
| Format: | Article |
| Language: | eng |
| Subjects: | |
| Online access: | Order full text |
Abstract: Wearables are fundamental to improving our understanding of human activities,
especially for an increasing number of healthcare applications from
rehabilitation to fine-grained gait analysis. Although our collective know-how
to solve Human Activity Recognition (HAR) problems with wearables has
progressed immensely with end-to-end deep learning paradigms, several
fundamental opportunities remain overlooked. We rigorously explore these new
opportunities to learn enriched and highly discriminating activity
representations. We propose: i) learning to exploit the latent relationships
between multi-channel sensor modalities and specific activities; ii)
investigating the effectiveness of data-agnostic augmentation for multi-modal
sensor data streams to regularize deep HAR models; and iii) incorporating a
classification loss criterion to encourage minimal intra-class representation
differences whilst maximising inter-class differences to achieve more
discriminative features. Our contributions achieve new state-of-the-art
performance on four diverse activity recognition benchmarks by large
margins, with up to 6% relative margin improvement. We validate the
contributions of our design concepts through extensive experiments,
including activity misalignment measures, ablation studies and insights shared
through both quantitative and qualitative studies.
DOI: 10.48550/arxiv.2007.07172
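
The three design concepts listed in the abstract can be made concrete with short, hedged sketches. The first concept, learning latent relationships between multi-channel sensor modalities and activities, suggests some form of attention across sensor channels. The abstract does not specify the mechanism, so the PyTorch module below (`SensorChannelAttention`, with assumed dimensions and a 9-channel IMU example) is an illustrative sketch, not the authors' published architecture.

```python
# Hypothetical sketch (not the authors' code): self-attention across sensor
# channels, so the model can weight individual channels differently per window.
import torch
import torch.nn as nn


class SensorChannelAttention(nn.Module):
    """Attends over the sensor channels of a windowed multi-channel input.

    Input shape: (batch, channels, time). Each channel is embedded from its
    time dimension, channels attend to each other, and the attended features
    are pooled into a single window-level representation.
    """

    def __init__(self, num_timesteps: int, embed_dim: int = 64, num_heads: int = 4):
        super().__init__()
        self.embed = nn.Linear(num_timesteps, embed_dim)  # per-channel embedding
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) -> tokens: (batch, channels, embed_dim)
        tokens = self.embed(x)
        attended, _ = self.attn(tokens, tokens, tokens)   # channels attend to channels
        tokens = self.norm(tokens + attended)             # residual connection + norm
        return tokens.mean(dim=1)                         # pool over channels


if __name__ == "__main__":
    # e.g. a 2-second window at 50 Hz from a 9-channel IMU (accel/gyro/mag x, y, z)
    window = torch.randn(8, 9, 100)
    feats = SensorChannelAttention(num_timesteps=100)(window)
    print(feats.shape)  # torch.Size([8, 64])
```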
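
The second concept is data-agnostic augmentation for multi-modal sensor streams. The abstract does not name the augmentation used; mixup is one widely used data-agnostic choice, so the sketch below, which blends pairs of sensor windows and their labels irrespective of modality, is an assumed example only.

```python
# Hedged sketch: mixup as a data-agnostic augmentation for sensor windows.
# The choice of mixup and the alpha value are assumptions, not from the paper.
import numpy as np
import torch


def mixup(x: torch.Tensor, y: torch.Tensor, num_classes: int, alpha: float = 0.4):
    """Blend a batch of sensor windows with a shuffled copy of itself.

    x: (batch, channels, time) float tensor; y: (batch,) integer labels.
    Returns mixed inputs and soft labels of shape (batch, num_classes).
    """
    lam = float(np.random.beta(alpha, alpha))
    perm = torch.randperm(x.size(0))
    x_mixed = lam * x + (1.0 - lam) * x[perm]
    y_onehot = torch.nn.functional.one_hot(y, num_classes).float()
    y_mixed = lam * y_onehot + (1.0 - lam) * y_onehot[perm]
    return x_mixed, y_mixed


if __name__ == "__main__":
    x = torch.randn(16, 9, 100)        # 16 windows, 9 channels, 100 samples each
    y = torch.randint(0, 6, (16,))     # 6 activity classes
    xm, ym = mixup(x, y, num_classes=6)
    print(xm.shape, ym.shape)          # torch.Size([16, 9, 100]) torch.Size([16, 6])
```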
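
The third concept is a classification loss that minimises intra-class representation differences while maximising inter-class differences. A common way to realise this is to pair the usual cross-entropy loss with a center-loss-style penalty that pulls features toward learnable per-class centers; whether this matches the paper's exact criterion is not stated in the abstract, so the combination and the weighting factor (0.003) below are assumptions.

```python
# Hedged sketch: cross-entropy for inter-class separation plus a center-loss-style
# term (in the spirit of Wen et al., 2016) for intra-class compactness.
import torch
import torch.nn as nn


class CenterLoss(nn.Module):
    """Penalises the squared distance of each feature to its class center."""

    def __init__(self, num_classes: int, feat_dim: int):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # gather the center of each sample's class and measure the squared distance
        return ((features - self.centers[labels]) ** 2).sum(dim=1).mean()


if __name__ == "__main__":
    feats = torch.randn(16, 64, requires_grad=True)   # window-level representations
    logits = torch.randn(16, 6, requires_grad=True)   # class scores from the classifier
    labels = torch.randint(0, 6, (16,))

    center_loss = CenterLoss(num_classes=6, feat_dim=64)
    loss = nn.functional.cross_entropy(logits, labels) + 0.003 * center_loss(feats, labels)
    loss.backward()
    print(float(loss))
```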