Multi-Branch CNN GRU with attention mechanism for human action recognition

Bibliographic Details
Published in: Engineering Research Express 2023-06, Vol. 5 (2), p. 25055
Main authors: Verma, Updesh; Tyagi, Pratibha; Aneja, Manpreet Kaur
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Deep neural networks, including convolutional neural networks, have been widely used in recent years for the recognition of human actions. They have gained popularity because their feature representations are more effective than those of traditional approaches. At the same time, deep learning networks face challenges such as the need for a sufficient amount of labelled data, which is rarely available, and the lack of computationally efficient resources. To overcome these challenges, this research proposes a multi-head deep learning architecture that combines a Convolutional Neural Network (CNN) and a Gated Recurrent Unit (GRU) with an attention mechanism for the recognition of human actions. Three lightweight CNN-GRU heads are utilized, and an attention mechanism is introduced in each head for the effective representation of important features and the suppression of undesired ones. Three benchmark datasets, PAMAP2, UCI-HAR and WISDM, are used for experimentation on the proposed model. The architecture outperformed other models in terms of accuracy, F1 score and computational efficiency, obtaining accuracies of 99.23%, 94.19% and 98.65% on the WISDM, UCI-HAR and PAMAP2 datasets, respectively.
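The abstract describes the architecture only at a high level. The sketch below is one plausible PyTorch reading of a three-branch CNN-GRU with per-branch attention over time steps; the kernel sizes, channel counts, hidden size, and the nine-channel, 128-step input window are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of a multi-head CNN-GRU with per-head attention (assumed
    # hyperparameters for illustration; not the authors' implementation).
    import torch
    import torch.nn as nn

    class CNNGRUHead(nn.Module):
        """One branch: 1D convolution -> GRU -> additive attention over time steps."""
        def __init__(self, in_channels, conv_channels, kernel_size, gru_hidden):
            super().__init__()
            self.conv = nn.Sequential(
                nn.Conv1d(in_channels, conv_channels, kernel_size, padding=kernel_size // 2),
                nn.ReLU(),
                nn.MaxPool1d(2),
            )
            self.gru = nn.GRU(conv_channels, gru_hidden, batch_first=True)
            self.attn = nn.Linear(gru_hidden, 1)   # scores each time step

        def forward(self, x):                      # x: (batch, channels, time)
            feats = self.conv(x).transpose(1, 2)   # (batch, time', conv_channels)
            out, _ = self.gru(feats)               # (batch, time', gru_hidden)
            weights = torch.softmax(self.attn(out), dim=1)
            return (weights * out).sum(dim=1)      # attention-weighted summary

    class MultiHeadCNNGRU(nn.Module):
        """Three branches with different kernel sizes, concatenated and classified."""
        def __init__(self, in_channels, num_classes, gru_hidden=64):
            super().__init__()
            self.heads = nn.ModuleList(
                [CNNGRUHead(in_channels, 32, k, gru_hidden) for k in (3, 5, 7)]
            )
            self.classifier = nn.Linear(3 * gru_hidden, num_classes)

        def forward(self, x):
            return self.classifier(torch.cat([h(x) for h in self.heads], dim=1))

    # Example: a batch of 8 windows, 9 sensor channels, 128 time steps, 6 activity classes.
    model = MultiHeadCNNGRU(in_channels=9, num_classes=6)
    logits = model(torch.randn(8, 9, 128))         # shape (8, 6)

In this reading, each branch weights its GRU outputs with a learned attention score so that informative time steps dominate the pooled summary, and the three branch summaries are concatenated before a linear classifier produces the activity logits.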
ISSN: 2631-8695
DOI: 10.1088/2631-8695/acd98c