Real-time dense 3D reconstruction and camera tracking via embedded planes representation

Detailed Description

Bibliographic Details
Published in: The Visual Computer 2020-10, Vol. 36 (10-12), p. 2215-2226
Main authors: Fu, Yanping, Yan, Qingan, Liao, Jie, Chow, Alix L. H., Xiao, Chunxia
Format: Article
Language: English
Description
Abstract: This paper proposes a novel approach for robust plane matching and real-time RGB-D fusion based on a representation in plane parameter space. In contrast to previous plane-based SLAM algorithms, which estimate correspondences for each plane pair independently, our method explores the holistic topology of all relevant planes. We note that, by adopting a low-dimensional parameter-space representation, plane matching can be intuitively reformulated and solved as a point cloud registration problem. Besides estimating the plane correspondences, we contribute an efficient optimization framework that employs both frame-to-frame and frame-to-model planar consistency constraints. We propose a global plane map to dynamically represent the reconstructed scene and alleviate the accumulation errors that arise in camera pose tracking. We validate the proposed algorithm on standard benchmark datasets and on additional challenging real-world environments. The experimental results demonstrate that it outperforms current state-of-the-art methods in tracking robustness and reconstruction fidelity.
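
The key idea in the abstract is that each plane, parameterized by its unit normal n and offset d (with n · x = d), becomes a single point in a low-dimensional parameter space, so the plane correspondence search reduces to a point cloud registration problem. The sketch below illustrates this under assumptions not taken from the paper: planes are embedded as p = n · d, warped by the current pose estimate, and matched by nearest-neighbour search; the embedding, function names, and distance threshold are illustrative only.

```python
import numpy as np
from scipy.spatial import cKDTree

def plane_to_param_point(n, d):
    # Embed a plane (unit normal n, offset d, with n . x = d) as the
    # 3-D point p = n * d. Hypothetical embedding; the paper's exact
    # parameter-space representation may differ.
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    return n * d

def transform_plane(n, d, R, t):
    # Transform the plane n . x = d from the source frame into the
    # target frame under the rigid motion x' = R x + t.
    n_new = R @ n
    d_new = d + n_new @ t
    return n_new, d_new

def match_planes(src_planes, dst_planes, R, t, max_dist=0.2):
    # Match source planes to destination planes by nearest neighbour in
    # parameter space, after warping the sources with the current pose
    # estimate (R, t). Returns (src_index, dst_index) pairs.
    dst_pts = np.array([plane_to_param_point(n, d) for n, d in dst_planes])
    tree = cKDTree(dst_pts)
    matches = []
    for i, (n, d) in enumerate(src_planes):
        n_w, d_w = transform_plane(n, d, R, t)
        dist, j = tree.query(plane_to_param_point(n_w, d_w))
        if dist < max_dist:
            matches.append((i, j))
    return matches

# Toy usage: two planes observed again after a small translation.
src = [(np.array([0.0, 0.0, 1.0]), 2.0), (np.array([1.0, 0.0, 0.0]), 1.0)]
R, t = np.eye(3), np.array([0.0, 0.0, 0.1])
dst = [transform_plane(n, d, R, t) for n, d in src]
print(match_planes(src, dst, R, t))  # expected: [(0, 0), (1, 1)]
```

In a full pipeline the matched plane pairs would then feed the paper's frame-to-frame and frame-to-model consistency constraints to refine the camera pose; the sketch covers only the correspondence search in parameter space.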
ISSN:0178-2789
1432-2315
DOI:10.1007/s00371-020-01899-1