Wearable Ego-Motion Tracking for Blind Navigation in Indoor Environments

Bibliographic Details
Published in: IEEE Transactions on Automation Science and Engineering, October 2015, Vol. 12 (4), pp. 1181-1190
Authors: He, Hongsheng; Li, Yan; Guan, Yong; Tan, Jindong
Format: Article
Language: English
Description
Abstract: This paper proposes an ego-motion tracking method that utilizes visual-inertial sensors for wearable blind navigation. The unique challenge of wearable motion tracking is to cope with arbitrary body motion and complex environmental dynamics. We introduce a visual sanity check that selects accurate visual estimations by comparing the visually estimated rotation with the rotation measured by a gyroscope. The movement trajectory is recovered through adaptive fusion of visual estimations and inertial measurements, where the visual estimation outputs the motion transformation between consecutive image captures and the inertial sensors measure translational acceleration and angular velocities. The frame rates of the visual and inertial sensors differ and vary over time owing to the visual sanity checks. We therefore employ a multirate extended Kalman filter (EKF) to fuse the visual and inertial estimations. The proposed method was tested in different indoor environments, and the results show its effectiveness and accuracy in ego-motion tracking.
ISSN: 1545-5955, 1558-3783
DOI: 10.1109/TASE.2015.2471175
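
The abstract describes two mechanisms worth unpacking: a gyro-based sanity check that gates the visual estimations, and a multirate EKF that fuses the accepted visual transformations with inertial measurements arriving at a different, time-varying rate. The sketch below is a minimal planar (2D) illustration of that structure in Python, not the paper's formulation: the five-element state, the noise matrices, the 0.05 rad rejection threshold, and the treatment of the composed visual transform as an absolute-pose pseudo-measurement are all simplifying assumptions made for illustration.

```python
import numpy as np

class MultiratePlanarEKF:
    """Planar (x, y, heading) toy version of a multirate visual-inertial EKF."""

    def __init__(self):
        # State: [x, y, theta, vx, vy]; all noise values are illustrative.
        self.x = np.zeros(5)
        self.P = np.eye(5) * 0.1
        self.Q = np.diag([1e-4, 1e-4, 1e-5, 1e-3, 1e-3])  # process noise (assumed)
        self.R_vis = np.diag([1e-3, 1e-3, 1e-4])          # visual noise (assumed)

    def predict(self, acc_body, gyro_z, dt):
        """Prediction at the IMU rate from body-frame acceleration and yaw rate."""
        x, y, th, vx, vy = self.x
        c, s = np.cos(th), np.sin(th)
        ax = c * acc_body[0] - s * acc_body[1]   # rotate acceleration to world frame
        ay = s * acc_body[0] + c * acc_body[1]
        self.x = np.array([x + vx * dt, y + vy * dt, th + gyro_z * dt,
                           vx + ax * dt, vy + ay * dt])
        F = np.eye(5)                            # simplified Jacobian (rotation coupling omitted)
        F[0, 3] = F[1, 4] = dt
        self.P = F @ self.P @ F.T + self.Q * dt

    def visual_update(self, z_pose, dtheta_vis, dtheta_gyro, thresh_rad=0.05):
        """Update at the (variable) camera rate.

        z_pose: [x, y, theta] obtained by composing the last accepted pose with the
        frame-to-frame visual transform. The sanity check rejects the visual
        estimate when its rotation disagrees with the gyro-integrated rotation
        over the same interval.
        """
        if abs(dtheta_vis - dtheta_gyro) > thresh_rad:
            return False                          # visual estimate rejected
        H = np.zeros((3, 5))
        H[0, 0] = H[1, 1] = H[2, 2] = 1.0         # measure position and heading
        innovation = z_pose - H @ self.x
        S = H @ self.P @ H.T + self.R_vis
        K = self.P @ H.T @ np.linalg.inv(S)       # Kalman gain
        self.x = self.x + K @ innovation
        self.P = (np.eye(5) - K @ H) @ self.P
        return True


# Example: several IMU-rate predictions between two camera frames, then one gated update.
ekf = MultiratePlanarEKF()
for _ in range(10):                               # e.g. IMU at 100 Hz
    ekf.predict(acc_body=np.array([0.1, 0.0]), gyro_z=0.01, dt=0.01)
accepted = ekf.visual_update(z_pose=np.array([0.005, 0.0, 0.001]),
                             dtheta_vis=0.0011, dtheta_gyro=0.0010)
```

The point the sketch preserves is that a rejected visual estimate simply skips the update step, so the filter keeps running on inertial prediction alone and its measurement rate varies over time, as the abstract describes.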