Angular Tracking Consistency Guided Fast Feature Association for Visual-Inertial SLAM

Bibliographic details
Published in: IEEE Transactions on Instrumentation and Measurement, 2024-01, Vol. 73, p. 1-1
Authors: Xie, Hongle; Deng, Tianchen; Wang, Jingchuan; Chen, Weidong
Format: Article
Language: English
Description
Abstract: Sparse-feature-based visual-inertial SLAM systems show great potential for accurate real-time pose estimation, especially on low-cost devices. However, feature-correspondence outliers inevitably degrade localization accuracy or even cause failures. Unlike existing methods that eliminate outliers by fitting a geometric model, which have high complexity and rely on model hypotheses, we present a general and efficient model-free scheme to address these challenges. In particular, we propose a novel uniform bipartite motion field (UBMF) to exactly measure the spatial transforms of sparse feature correspondences in consecutive frames. Moreover, a new recursive angular tracking consistency (RATC) guided fast feature association algorithm is designed, which efficiently selects correspondences and updates the UBMF simultaneously while retaining linear computational complexity and a theoretical performance guarantee. Furthermore, we develop a lightweight angular-tracking-consistency-guided visual-inertial SLAM (ATVIS) system, which achieves better robustness and outperforms state-of-the-art methods. Extensive qualitative and quantitative validations on public benchmarks and different real-world experiments demonstrate the superiority of our method in both localization accuracy and computational efficiency.
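The abstract does not spell out the UBMF/RATC formulation itself. As a rough, hedged sketch of the underlying idea only (rejecting correspondences whose frame-to-frame motion direction disagrees with the dominant one, in a single linear-time pass), a model-free angular-consistency filter might look like the following; the function name, angle threshold, and circular-mean heuristic are illustrative assumptions, not the paper's algorithm.

import numpy as np

def filter_by_angular_consistency(pts_prev, pts_curr, angle_tol_deg=10.0):
    # pts_prev, pts_curr: (N, 2) arrays of matched feature locations in two
    # consecutive frames (hypothetical inputs, for illustration only).
    flow = pts_curr - pts_prev                     # per-feature motion vectors
    angles = np.arctan2(flow[:, 1], flow[:, 0])    # motion directions (radians)

    # Dominant direction via the circular mean, which handles angle wrap-around.
    mean_dir = np.arctan2(np.sin(angles).mean(), np.cos(angles).mean())

    # Smallest absolute angular deviation from the dominant direction.
    dev = np.abs(np.arctan2(np.sin(angles - mean_dir), np.cos(angles - mean_dir)))

    # Keep correspondences whose motion direction is consistent; one pass, O(N).
    return dev < np.deg2rad(angle_tol_deg)

A system like the one described would presumably apply such a consistency check recursively across frames and couple it with the inertial prediction; this toy filter does neither and is only meant to make the angular-consistency notion concrete.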
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2023.3348902