Robust point cloud registration for map-based autonomous robot navigation

Bibliographic details
Published in: EURASIP Journal on Advances in Signal Processing, 2024-12, Vol. 2024 (1), p. 57-25, Article 57
Authors: Efraim, Amit; Francos, Joseph M.
Format: Article
Language: English
Online access: Full text
Abstract: Autonomous navigation in large-scale and complex environments in the absence of a GPS signal is a fundamental challenge encountered in a variety of applications. Since 3-D scans are inherently robust to changes in ambient illumination and surface texture, we present Point Cloud Map-based Navigation (PCMN), a robust robot navigation system based exclusively on 3-D point cloud registration between an acquired observation and a stored reference map. It provides a drift-free navigation solution equipped with a failed-registration detection capability. The backbone of the navigation system is a robust method for registering the acquired observation to the stored reference map. The proposed registration algorithm follows a hypothesis generation and evaluation paradigm, in which multiple statistically independent hypotheses are generated from local neighborhoods of putative matching points. The hypotheses are then evaluated using a multiple-consensus analysis that combines an evaluation of point cloud feature correlation with a consensus test on the special Euclidean group SE(3) applied to the independent hypothesized estimates. The proposed PCMN is shown to significantly outperform state-of-the-art methods in both place recognition recall and localization accuracy, achieving sub-mesh-resolution accuracy in both indoor and outdoor settings.
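
To make the hypothesize-and-verify idea in the abstract concrete, the following Python/NumPy sketch illustrates one plausible reading: each putative point match spawns an independent SE(3) hypothesis estimated from its local neighborhood, and hypotheses are scored by how many other hypotheses agree with them on SE(3). This is an illustrative sketch only, not the authors' implementation; the function names (estimate_hypothesis, consensus_score) and the agreement thresholds are hypothetical choices, and the paper's feature-correlation test is not shown here.

import numpy as np

def estimate_hypothesis(P, Q):
    """Rigid transform (R, t) aligning neighborhood P (Nx3) to Q (Nx3) via the Kabsch/Procrustes solution."""
    p0, q0 = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p0).T @ (Q - q0)                      # 3x3 cross-covariance of centered neighborhoods
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = q0 - R @ p0
    return R, t

def se3_distance(h1, h2):
    """Rotation angle (rad) and translation gap between two SE(3) hypotheses."""
    (R1, t1), (R2, t2) = h1, h2
    cos_ang = np.clip((np.trace(R1.T @ R2) - 1.0) / 2.0, -1.0, 1.0)
    return np.arccos(cos_ang), np.linalg.norm(t1 - t2)

def consensus_score(hypotheses, ang_thr=np.deg2rad(5.0), trans_thr=0.5):
    """For each hypothesis, count how many other independent hypotheses agree with it on SE(3)."""
    scores = []
    for i, hi in enumerate(hypotheses):
        votes = sum(
            1
            for j, hj in enumerate(hypotheses)
            if i != j and all(d < thr for d, thr in zip(se3_distance(hi, hj), (ang_thr, trans_thr)))
        )
        scores.append(votes)
    return np.array(scores)

Usage, under the same assumptions: hypotheses = [estimate_hypothesis(P_k, Q_k) for each matched neighborhood k]; the hypothesis with the largest consensus score is accepted as the observation-to-map registration, and a low maximum score can flag a failed registration, in the spirit of the failed-registration detection described above.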
ISSN: 1687-6180, 1687-6172
DOI: 10.1186/s13634-024-01153-z