Building and optimization of 3D semantic map based on Lidar and camera fusion

Bibliographic Details
Published in: Neurocomputing (Amsterdam), 2020-10, Vol. 409, pp. 394-407
Authors: Li, Jing; Zhang, Xin; Li, Jiehao; Liu, Yanyu; Wang, Junzheng
Format: Article
Language: English
Online access: Full text
Description
Abstract: For robot applications in complex scenarios, traditional geometric maps are insufficient because they support no semantic interaction with the environment. In this paper, a large-scale, accurate three-dimensional (3D) semantic map that integrates Lidar and camera information is presented for real-time road scenes. First, simultaneous localization and mapping (SLAM) with multi-sensor fusion of Lidar and an inertial measurement unit (IMU) is performed to locate the robot, and a map of the surrounding scene is constructed while the robot moves. A convolutional neural network (CNN)-based semantic segmentation of images is then employed to build the semantic map of the environment. After synchronization in time and space, the fused Lidar and camera data are used to generate semantically labeled point-cloud frames, which are assembled into a semantic map according to the robot's pose. Furthermore, to improve classification accuracy, a higher-order 3D fully connected conditional random field (CRF) is applied to optimize the semantic map. Finally, extensive experimental results on the KITTI dataset illustrate the effectiveness of the proposed method.
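
The Lidar-camera fusion step described in the abstract amounts to projecting each Lidar point into the segmented camera image and copying the predicted per-pixel class onto the point. The following is a minimal sketch of that projection under KITTI-style calibration; the function name, the seg_mask input, and the assumption that the extrinsic matrix already folds in rectification are illustrative, not the authors' actual interface.

    import numpy as np

    def label_point_cloud(points, seg_mask, P, Tr):
        """points: (N, 3) Lidar XYZ; seg_mask: (H, W) per-pixel class ids;
        P: (3, 4) camera projection matrix; Tr: (4, 4) Lidar-to-camera
        extrinsics (assumed to already include rectification).
        Returns (M, 4) rows of [x, y, z, label] for points visible in the image."""
        H, W = seg_mask.shape
        # Homogeneous Lidar coordinates -> camera frame.
        pts_h = np.hstack([points, np.ones((points.shape[0], 1))])   # (N, 4)
        pts_cam = (Tr @ pts_h.T).T                                   # (N, 4)
        # Keep only points in front of the camera.
        front = pts_cam[:, 2] > 0.1
        pts_cam = pts_cam[front]
        # Perspective projection into pixel coordinates.
        uvw = (P @ pts_cam.T).T                                      # (M, 3)
        u = (uvw[:, 0] / uvw[:, 2]).astype(int)
        v = (uvw[:, 1] / uvw[:, 2]).astype(int)
        # Discard projections that fall outside the image bounds.
        inside = (u >= 0) & (u < W) & (v >= 0) & (v < H)
        labels = seg_mask[v[inside], u[inside]]
        return np.hstack([points[front][inside], labels[:, None].astype(float)])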
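For reference, a standard fully connected CRF over point labels x (after Krähenbühl and Koltun) minimizes an energy of the form

    E(x) = \sum_i \psi_u(x_i) + \sum_{i<j} \psi_p(x_i, x_j),
    \psi_p(x_i, x_j) = \mu(x_i, x_j) \sum_m w^{(m)} k^{(m)}(f_i, f_j)

where \psi_u is the unary potential from the segmentation network, \mu is a label compatibility function, and the k^{(m)} are Gaussian kernels over feature vectors f_i (here plausibly 3D position plus color or intensity). The paper's higher-order 3D variant adds potentials over larger point cliques; their exact form is not given in this abstract.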
ISSN: 0925-2312; 1872-8286
DOI: 10.1016/j.neucom.2020.06.004