Semi-Direct Visual Odometry and Mapping System with RGB-D Camera


Bibliographic Details
Published in: 北京理工大学学报(英文版) 2019-03, Vol. 28 (1), p. 83-93
Authors: Xinliang Zhong, Xiao Luo, Jiaheng Zhao, Yutong Huang
Format: Article
Language: English
Online access: Full text
Description
Abstract: TP242.6; In this paper a semi-direct visual odometry and mapping system with an RGB-D camera is proposed, which combines the merits of both feature-based and direct methods. The presented system directly estimates the camera motion between two consecutive RGB-D frames by minimizing the photometric error. To tolerate outliers and noise, a robust sensor model built upon the t-distribution and an error function mixing depth and photometric errors are used to enhance accuracy and robustness. Local graph optimization based on key frames is used to reduce the accumulated error and refine the local map. The loop closure detection method, which combines an appearance similarity method with spatial location constraints, increases the speed of detection. Experimental results demonstrate that the proposed approach achieves higher accuracy in motion estimation and environment reconstruction compared to other state-of-the-art methods. Moreover, the proposed approach runs in real time on a laptop without a GPU, which makes it attractive for robots equipped with limited computational resources.
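The mixed depth-and-photometric error with a Student-t robust weight that the abstract describes can be illustrated with a small sketch. The Python snippet below is a minimal illustration under assumed pinhole intrinsics; the helper names (backproject, project, t_weight, mixed_residual), the intrinsics, and the lambda weighting are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Assumed pinhole intrinsics (illustrative values, not from the paper).
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5

def backproject(u, v, z):
    """Lift pixel (u, v) with depth z to a 3-D point in the camera frame."""
    return np.array([(u - CX) * z / FX, (v - CY) * z / FY, z])

def project(p):
    """Project a 3-D camera-frame point back onto the image plane."""
    x, y, z = p
    return np.array([FX * x / z + CX, FY * y / z + CY])

def t_weight(r, sigma, nu=5.0):
    """Student-t robust weight: large residuals (outliers, noise)
    receive small weights instead of dominating the least squares."""
    return (nu + 1.0) / (nu + (r / sigma) ** 2)

def mixed_residual(img_ref, depth_ref, img_cur, depth_cur, R, t, u, v, lam=0.1):
    """Warp reference pixel (u, v) into the current frame with the
    candidate pose (R, t) and return the stacked photometric and
    depth residuals, or None if the warp leaves the image."""
    p_ref = backproject(u, v, depth_ref[v, u])
    p_cur = R @ p_ref + t
    u2, v2 = np.round(project(p_cur)).astype(int)
    h, w = img_cur.shape
    if not (0 <= u2 < w and 0 <= v2 < h):
        return None
    r_photo = float(img_cur[v2, u2]) - float(img_ref[v, u])
    r_depth = float(depth_cur[v2, u2]) - p_cur[2]
    return np.array([r_photo, lam * r_depth])
```

In an iteratively re-weighted least-squares pose estimator, each residual component would be scaled by t_weight(r, sigma) before solving for the pose update, so that outlier pixels contribute little to the motion estimate; that optimization loop is omitted from this sketch.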
ISSN: 1004-0579
DOI: 10.15918/j.jbit1004-0579.17149