Pedestrian Stride Length Estimation Based on Bidirectional LSTM and CNN Architecture
Published in: IEEE Access, 2024, Vol. 12, pp. 124718-124728
Format: Article
Language: English
Online Access: Full text
Abstract: An increasing number of stride length estimation (SLE) algorithms adopt deep learning for its ability to automatically learn complex nonlinearities in the data. However, datasets with ground-truth stride length are small, which puts network performance at risk of overfitting. Furthermore, the inertial sensor data collected from users are essentially time-series data whose spatial features vary with individual walking characteristics, and the variety of users and motions increases the complexity of the SLE task. Therefore, to bypass the issue of a small training dataset and effectively capture the spatio-temporal features inherent in the data, we propose a bidirectional long short-term memory (BiLSTM) network combined with a convolutional neural network (CNN), termed BiLSTM/CNN, which we train with a data augmentation method based on random rotation matrices. Through quantitative and qualitative evaluations on three public datasets, we show that the proposed method outperforms state-of-the-art algorithms in terms of walking distance error rate and stride length error rate, achieving 2.62% and 5.60%, respectively. The results demonstrate that the BiLSTM/CNN and the data augmentation method improve the generalization ability of the network across different users engaged in daily-life activities and various motion modes.
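The abstract names the building blocks (a CNN for spatial features, a BiLSTM for temporal context) but not the exact layer configuration. The following PyTorch sketch shows one plausible arrangement of this kind; the class name `BiLSTMCNN`, all layer sizes, and the CNN-before-BiLSTM ordering are illustrative assumptions, not the paper's architecture.

```python
# A minimal sketch, assuming 6 input channels (3-axis accelerometer +
# 3-axis gyroscope) per stride window; not the paper's exact layers.
import torch
import torch.nn as nn

class BiLSTMCNN(nn.Module):
    def __init__(self, in_channels: int = 6, hidden: int = 64):
        super().__init__()
        # 1-D convolutions extract local spatial features per time step.
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # A bidirectional LSTM captures temporal context in both directions.
        self.lstm = nn.LSTM(64, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)  # regress scalar stride length

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels) inertial window for one stride
        z = self.cnn(x.transpose(1, 2)).transpose(1, 2)  # (B, T, 64)
        out, _ = self.lstm(z)                            # (B, T, 2*hidden)
        return self.head(out[:, -1]).squeeze(-1)         # (B,)
```

Trained with an ordinary regression loss (e.g., MSE on stride length), such a model maps each stride window to a scalar length; per-stride predictions of this form are what the reported error rates would be computed from.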
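The augmentation the abstract describes multiplies the inertial measurements by a random rotation matrix, which simulates new sensor orientations while leaving the stride-length label unchanged (stride length is orientation-invariant). The NumPy sketch below shows one standard way to do this; the function names are illustrative, and the paper's exact rotation-sampling scheme is not given here.

```python
# A minimal sketch of rotation-based augmentation for inertial windows.
# Names (random_rotation_matrix, augment_stride) are illustrative.
import numpy as np

def random_rotation_matrix(rng: np.random.Generator) -> np.ndarray:
    """Draw a random 3-D rotation: QR of a Gaussian matrix, det forced to +1."""
    a = rng.normal(size=(3, 3))
    q, r = np.linalg.qr(a)
    q *= np.sign(np.diag(r))      # sign-correct columns for uniformity
    if np.linalg.det(q) < 0:      # map O(3) to SO(3) by flipping one column
        q[:, 0] = -q[:, 0]
    return q

def augment_stride(accel: np.ndarray, gyro: np.ndarray,
                   rng: np.random.Generator) -> tuple[np.ndarray, np.ndarray]:
    """Rotate one (T, 3) accel/gyro stride window by the same random matrix."""
    rot = random_rotation_matrix(rng)
    # Each row vector v becomes v @ rot.T, i.e. rot @ v per sample; the
    # stride-length label stays unchanged.
    return accel @ rot.T, gyro @ rot.T

rng = np.random.default_rng(0)
acc = rng.normal(size=(128, 3))   # dummy 128-sample accelerometer window
gyr = rng.normal(size=(128, 3))   # dummy gyroscope window
acc_aug, gyr_aug = augment_stride(acc, gyr, rng)
```

Applying the same rotation to both sensors of a window preserves their relative geometry, so each draw yields a physically plausible new training sample from an existing one.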
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3454049