Mode-Independent Stride Length Estimation With IMUs in Smartphones

Bibliographic Details
Published in: IEEE Sensors Journal, 2022-03, Vol. 22 (6), pp. 5824-5833
Authors: Bo, Fan; Li, Jia; Wang, Weibing
Format: Article
Language: English
Description
Abstract: Robust and accurate human stride length estimation (SLE) using smartphone-integrated inertial measurement units (IMUs) is essential in pedestrian dead reckoning (PDR) and mobile health applications. However, changes in smartphone carrying mode (i.e., sensor location) during daily usage often lead to significant estimation errors. To address this problem, we propose a novel SLE framework called Mode-Independent Neural Network (MINN), which uses multi-source unsupervised domain adaptation (UDA). First, we present a hierarchical neural network that extracts spatio-temporal features with a multi-level ResNet and a GRU. Then, we use adversarial training and a subclass classifier to build a UDA network that extracts mode-invariant features shared across data from different modes. Finally, we integrate these architectures into an end-to-end learning framework. In a systematic evaluation under the leave-one-out setting on two public SLE datasets, MINN outperforms state-of-the-art algorithms, achieving stride length error rates of 2.5% and 5.1% in supervised settings. We also evaluate the mode-independent adaptability of the model on single and multiple UDA tasks. The results demonstrate that the proposed MINN significantly improves the generalization of the SLE model to new subjects or carrying modes.
ISSN: 1530-437X / 1558-1748
DOI: 10.1109/JSEN.2022.3148313
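
The abstract describes a hierarchical ResNet+GRU feature extractor trained adversarially against a carrying-mode classifier so that the learned features become mode-invariant. The sketch below is a hypothetical PyTorch illustration of that general pattern, not the authors' implementation: the layer sizes, IMU channel count, window length, number of modes, and the gradient-reversal layer are all assumptions made here, and the paper's subclass classifier and multi-source UDA scheme are not reproduced.

```python
# Hypothetical sketch of a ResNet+GRU stride-length model with an adversarial
# (gradient-reversal) carrying-mode classifier. All hyperparameters are assumptions.
import torch
import torch.nn as nn
from torch.autograd import Function


class GradReverse(Function):
    """Identity in the forward pass; reverses and scales gradients in the backward pass."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class ResBlock1d(nn.Module):
    """Simple 1-D residual block over the time axis of an IMU window."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv1d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size=3, padding=1)
        self.bn1, self.bn2 = nn.BatchNorm1d(channels), nn.BatchNorm1d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)


class StrideNet(nn.Module):
    """Convolutional (ResNet-style) front end + GRU, with a stride-length regressor
    and a carrying-mode (domain) classifier behind gradient reversal."""
    def __init__(self, in_channels=6, hidden=64, num_modes=4):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv1d(in_channels, hidden, kernel_size=7, padding=3),
                                  nn.BatchNorm1d(hidden), nn.ReLU())
        self.res = nn.Sequential(ResBlock1d(hidden), ResBlock1d(hidden))
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.regressor = nn.Sequential(nn.Linear(hidden, 32), nn.ReLU(), nn.Linear(32, 1))
        self.domain_clf = nn.Sequential(nn.Linear(hidden, 32), nn.ReLU(), nn.Linear(32, num_modes))

    def forward(self, x, lambd=1.0):
        # x: (batch, channels = accel + gyro, time)
        h = self.res(self.stem(x))               # (batch, hidden, time)
        _, h_n = self.gru(h.transpose(1, 2))     # final GRU state: (1, batch, hidden)
        feat = h_n.squeeze(0)
        stride = self.regressor(feat).squeeze(-1)
        mode_logits = self.domain_clf(GradReverse.apply(feat, lambd))
        return stride, mode_logits


# Minimal usage with random tensors standing in for labelled source-mode windows.
model = StrideNet()
x = torch.randn(8, 6, 128)                      # 8 windows, 6 IMU channels, 128 samples
stride_gt = torch.rand(8)                       # dummy stride lengths (metres)
mode_gt = torch.randint(0, 4, (8,))             # dummy carrying-mode labels
stride_pred, mode_logits = model(x, lambd=0.5)
loss = nn.functional.mse_loss(stride_pred, stride_gt) \
     + nn.functional.cross_entropy(mode_logits, mode_gt)
loss.backward()
```

Training the regressor jointly with the reversed-gradient mode classifier pushes the shared features toward being uninformative about the carrying mode, which is the mode-invariance effect the abstract attributes to its adversarial UDA component.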