Smartphone-based gait recognition using convolutional neural networks and dual-tree complex wavelet transform

Bibliographic Details
Published in: Multimedia Systems, 2022-12, Vol. 28 (6), p. 2307-2317
Main Authors: Sezavar, Ahmadreza; Atta, Randa; Ghanbari, Mohammad
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Gait recognition is an efficient way of identifying people from their walking behavior, using inertial sensors integrated into smartphones. Inertial sensors such as accelerometers and gyroscopes readily collect the gait data used by existing deep learning-based gait recognition methods. Although these methods, in particular hybrid deep neural networks, provide good gait feature representations, their recognition accuracy needs to be improved and their computational cost reduced. In this paper, a person identification framework based on smartphone-acquired inertial gait signals is proposed to overcome these limitations. It combines a convolutional neural network (CNN) with the dual-tree complex wavelet transform (DTCWT) and is named CNN–DTCWT. In the proposed framework, a global average pooling layer and a DTCWT layer are integrated into the CNN to provide a robust and highly accurate inertial gait feature representation. Experimental results demonstrate the superiority of the proposed structure over state-of-the-art models. Tested on three data sets, it achieves higher recognition performance than state-of-the-art CNN-based and LSTM-based models and hybrid networks, with average recognition accuracy improvements of 1.7–14.95%.
ISSN: 0942-4962, 1432-1882
DOI: 10.1007/s00530-022-00954-2
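
The exact CNN–DTCWT architecture is not spelled out in this record, so the following is only a minimal sketch of the general idea named in the abstract: DTCWT subband statistics of a smartphone inertial window (computed with the open-source dtcwt package) fused with a 1D-CNN branch that ends in global average pooling. The 128-sample 6-channel window, the 3-level transform, all layer sizes, and the late-fusion scheme are illustrative assumptions, not the authors' implementation.

# Minimal sketch (not the authors' code): a 1D CNN over a smartphone
# accelerometer/gyroscope window, with DTCWT subband magnitudes as an
# extra feature branch and global average pooling before the classifier.
# Window length (128 samples, 6 channels), 3 DTCWT levels, layer sizes,
# and the number of subjects (50) are illustrative assumptions.
import numpy as np
import torch
import torch.nn as nn
import dtcwt  # pip install dtcwt


def dtcwt_features(window: np.ndarray, nlevels: int = 3) -> np.ndarray:
    """Per-channel DTCWT: mean magnitude of each complex highpass subband
    plus the lowpass energy, i.e. (nlevels + 1) values per channel."""
    transform = dtcwt.Transform1d()
    feats = []
    for ch in range(window.shape[1]):               # window: (time, channels)
        pyr = transform.forward(window[:, ch], nlevels=nlevels)
        feats.extend(np.abs(hp).mean() for hp in pyr.highpasses)
        feats.append(float(np.square(pyr.lowpass).mean()))
    return np.asarray(feats, dtype=np.float32)


class CnnDtcwt(nn.Module):
    """1D-CNN branch + DTCWT feature branch, fused before the classifier."""
    def __init__(self, n_channels=6, n_dtcwt_feats=24, n_subjects=50):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                # global average pooling
        )
        self.fc = nn.Linear(64 + n_dtcwt_feats, n_subjects)

    def forward(self, x, dtcwt_feats):
        # x: (batch, channels, time); dtcwt_feats: (batch, n_dtcwt_feats)
        z = self.cnn(x).squeeze(-1)                 # (batch, 64)
        return self.fc(torch.cat([z, dtcwt_feats], dim=1))


if __name__ == "__main__":
    window = np.random.randn(128, 6).astype(np.float32)     # fake gait window
    feats = torch.from_numpy(dtcwt_features(window))[None]  # (1, 24)
    model = CnnDtcwt(n_dtcwt_feats=feats.shape[1])
    x = torch.from_numpy(np.ascontiguousarray(window.T))[None]  # (1, 6, 128)
    print(model(x, feats).shape)                    # torch.Size([1, 50])

In this sketch the DTCWT acts as a fixed, hand-crafted feature extractor concatenated with the learned CNN features; the paper's "DTCWT layer" may instead be embedded inside the network, which this record does not specify.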