Deep Neural Networks for Sensor-Based Human Activity Recognition Using Selective Kernel Convolution

Published in: IEEE Transactions on Instrumentation and Measurement, 2021, Vol. 70, pp. 1-13, Article 2512313
Authors: Gao, Wenbin; Zhang, Lei; Huang, Wenbo; Min, Fuhong; He, Jun; Song, Aiguo
Format: Article
Language: English
Abstract: Recently, state-of-the-art performance on various sensor-based human activity recognition (HAR) tasks has been achieved by deep learning, which can automatically extract features from raw data. In standard convolutional neural networks (CNNs), the artificial neurons within each feature layer usually share the same receptive field (RF) size. It is well known that the RF size of neurons can change adaptively according to the stimulus, a property that has rarely been exploited in HAR. In this article, a new multibranch CNN that utilizes a selective kernel mechanism is introduced for HAR. To the best of our knowledge, this is the first time an attention idea has been adopted to perform kernel selection among multiple branches with different RFs in the HAR scenario. We perform extensive experiments on several benchmark HAR datasets, namely, UCI-HAR, UNIMIB SHAR, WISDM, PAMAP2, and OPPORTUNITY, as well as weakly labeled datasets. Ablation experiments show that the selective kernel convolution can adaptively choose an appropriate RF size among multiple branches for classifying numerous human activities. As a result, it achieves higher recognition accuracy under a similar computing budget.
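
As a rough illustration of the selective kernel mechanism summarized in the abstract, the following Python (PyTorch) sketch shows how a multibranch 1-D convolution block can weight branches with different kernel sizes, and hence different RFs, through a softmax attention vector. This is a minimal sketch of the general selective kernel idea, not the authors' published implementation; the class name SelectiveKernelConv1d, the choice of kernel sizes, and the reduction ratio are illustrative assumptions.

    import torch
    import torch.nn as nn

    class SelectiveKernelConv1d(nn.Module):
        # Hypothetical 1-D selective kernel block (not the authors' code).
        # Each branch applies a convolution with a different kernel size, i.e. a
        # different receptive field; an attention vector computed from the fused
        # branch outputs then weights the branches per channel, letting the block
        # adaptively emphasize the RF size that suits the current input window.
        def __init__(self, channels, kernel_sizes=(3, 5), reduction=4):
            super().__init__()
            self.branches = nn.ModuleList([
                nn.Sequential(
                    nn.Conv1d(channels, channels, k, padding=k // 2),
                    nn.BatchNorm1d(channels),
                    nn.ReLU(inplace=True),
                )
                for k in kernel_sizes
            ])
            hidden = max(channels // reduction, 8)
            self.squeeze = nn.Sequential(
                nn.AdaptiveAvgPool1d(1),   # global pooling over the time axis
                nn.Flatten(),
                nn.Linear(channels, hidden),
                nn.ReLU(inplace=True),
            )
            # one head per branch, producing per-channel selection logits
            self.attend = nn.ModuleList([nn.Linear(hidden, channels) for _ in kernel_sizes])

        def forward(self, x):  # x: (batch, channels, time)
            feats = torch.stack([branch(x) for branch in self.branches])  # (branches, batch, channels, time)
            fused = feats.sum(dim=0)                                      # fuse branches by summation
            z = self.squeeze(fused)                                       # (batch, hidden)
            logits = torch.stack([head(z) for head in self.attend])       # (branches, batch, channels)
            weights = torch.softmax(logits, dim=0).unsqueeze(-1)          # softmax across branches
            return (weights * feats).sum(dim=0)                           # (batch, channels, time)

    # Example usage on a batch of 8 sensor windows with 64 channels and 128 time steps:
    block = SelectiveKernelConv1d(channels=64, kernel_sizes=(3, 5, 7))
    out = block(torch.randn(8, 64, 128))
    print(out.shape)  # torch.Size([8, 64, 128])

Because the softmax is taken across branches, the per-channel weights sum to one, so the block selects a soft mixture of receptive fields rather than committing to a single kernel size.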
ISSN: 0018-9456 (print), 1557-9662 (electronic)
DOI: 10.1109/TIM.2021.3102735