Exploring a rich spatial–temporal dependent relational model for skeleton-based action recognition by bidirectional LSTM-CNN

Bibliographic Details
Published in: Neurocomputing (Amsterdam), 2020-11, Vol. 414, pp. 90-100
Authors: Zhu, Aichun, Wu, Qianyu, Cui, Ran, Wang, Tian, Hang, Wenlong, Hua, Gang, Snoussi, Hichem
Format: Article
Language: English
Online Access: Full text
Description
Abstract: With the rapid development of effective and low-cost human skeleton capture systems, skeleton-based action recognition has recently attracted much attention. Most existing methods based on Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) have achieved promising performance for skeleton-based action recognition. However, these approaches are limited in their ability to explore the rich spatial–temporal relational information. In this paper, we propose a new spatial–temporal model with an end-to-end bidirectional LSTM-CNN (BiLSTM-CNN). First, a hierarchical spatial–temporal dependent relational model is used to explore rich spatial–temporal information in the skeleton data. Then a new framework is proposed to fuse CNN and LSTM: the skeleton data are structured by the dependent relational model and serve as the input of the proposed network, an LSTM extracts the temporal features, and a standard CNN then explores the spatial information in the LSTM output. Finally, experimental results demonstrate the effectiveness of the proposed model on the NTU RGB+D, SBU Interaction and UTD-MHAD datasets.
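
The PyTorch sketch below is a minimal illustration of the LSTM-then-CNN fusion order the abstract describes, not the authors' implementation: the class name BiLSTMCNN, all layer sizes, the kernel and pooling choices, and the input layout (25 joints with 3D coordinates, as in NTU RGB+D skeletons) are illustrative assumptions.

    # A minimal sketch (not the paper's exact architecture) of a BiLSTM-CNN:
    # a bidirectional LSTM first extracts temporal features from the skeleton
    # sequence, then a standard CNN explores structure in the LSTM output.
    # All layer sizes are illustrative assumptions.
    import torch
    import torch.nn as nn

    class BiLSTMCNN(nn.Module):
        def __init__(self, in_features=25 * 3, hidden=128, num_classes=60):
            super().__init__()
            # Bidirectional LSTM over the temporal axis of the sequence.
            self.lstm = nn.LSTM(in_features, hidden, batch_first=True,
                                bidirectional=True)
            # Treat the BiLSTM output sequence as a 1-channel 2D feature map
            # (time x features) and apply a standard CNN on top of it.
            self.cnn = nn.Sequential(
                nn.Conv2d(1, 32, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(32, 64, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.AdaptiveAvgPool2d((4, 4)),
            )
            self.fc = nn.Linear(64 * 4 * 4, num_classes)

        def forward(self, x):
            # x: (batch, time, joints * 3 coordinates)
            h, _ = self.lstm(x)           # (batch, time, 2 * hidden)
            h = h.unsqueeze(1)            # (batch, 1, time, 2 * hidden)
            f = self.cnn(h)               # (batch, 64, 4, 4)
            return self.fc(f.flatten(1))  # class scores

    # Usage: a batch of 8 clips, 100 frames each, 25 joints in 3D,
    # scored against the 60 action classes of NTU RGB+D.
    model = BiLSTMCNN(in_features=25 * 3, hidden=128, num_classes=60)
    scores = model(torch.randn(8, 100, 25 * 3))  # -> (8, 60)
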
ISSN: 0925-2312
eISSN: 1872-8286
DOI: 10.1016/j.neucom.2020.07.068