LOFF: LiDAR and Optical Flow Fusion Odometry

Bibliographic Details
Published in: Drones (Basel), 2024-08, Vol. 8 (8), p. 411
Authors: Zhang, Junrui; Huang, Zhongbo; Zhu, Xingbao; Guo, Fenghe; Sun, Chenyang; Zhan, Quanxi; Shen, Runjie
Format: Article
Language: English
Online access: Full text
Description
Abstract: Simultaneous Localization and Mapping (SLAM) is a common approach to position estimation in GNSS-denied environments. However, the high structural consistency and low lighting of tunnel environments pose challenges for traditional visual SLAM and LiDAR SLAM. To address this, this paper presents LiDAR and optical flow fusion odometry (LOFF), which uses a direction-separated data fusion method to fuse optical-flow odometry into the degenerate direction of LiDAR SLAM without sacrificing accuracy. Moreover, LOFF incorporates detectors and a compensator, enabling a smooth transition between general and degenerate environments. This capability facilitates stable flight of unmanned aerial vehicles (UAVs) in GNSS-denied tunnel environments, including corners and long, structurally consistent sections. Through real-world experiments conducted in a GNSS-denied pedestrian tunnel, we demonstrate that LOFF achieves superior position accuracy and trajectory smoothness compared with state-of-the-art visual SLAM and LiDAR SLAM.
ISSN: 2504-446X
DOI:10.3390/drones8080411
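
The abstract describes a direction-separated fusion in which optical-flow odometry supplies the motion estimate only along the direction where LiDAR SLAM degenerates (e.g., along a tunnel axis), while well-constrained directions keep the LiDAR estimate. The paper's actual formulation is not reproduced here; the Python sketch below is only a hypothetical illustration of that idea, assuming the degenerate direction is identified from the small eigenvalues of the LiDAR registration Hessian and that the function name and threshold are illustrative, not taken from the paper.

import numpy as np

def fuse_direction_separated(t_lidar, t_flow, hessian, eig_threshold=100.0):
    # Hypothetical sketch of direction-separated fusion (not the paper's code).
    # t_lidar, t_flow: 3-vector translation increments from LiDAR SLAM and
    # optical-flow odometry. hessian: 3x3 information matrix of the LiDAR
    # registration; small eigenvalues indicate weakly constrained directions.
    eigvals, eigvecs = np.linalg.eigh(hessian)      # columns are principal directions
    fused = np.zeros(3)
    for val, v in zip(eigvals, eigvecs.T):
        # Degenerate direction: take the optical-flow estimate; otherwise keep LiDAR.
        source = t_flow if val < eig_threshold else t_lidar
        fused += v * np.dot(v, source)              # project chosen estimate onto this direction
    return fused

# Example: a tunnel aligned with the x-axis leaves x weakly constrained, so the
# fused x-component comes from optical flow while y and z stay with LiDAR.
H = np.diag([5.0, 500.0, 500.0])
print(fuse_direction_separated(np.array([0.0, 0.01, 0.0]),
                               np.array([0.12, 0.02, 0.0]), H))

Projecting each estimate onto eigen-directions, rather than switching the whole pose source, is one plausible way to read "without sacrificing accuracy": well-constrained axes never lose the LiDAR solution.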