Road Environment Semantic Segmentation with Deep Learning from MLS Point Cloud Data
Saved in:
Published in: | Sensors (Basel, Switzerland), 2019-08, Vol. 19 (16), p. 3466 |
Main authors: | , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Abstract: | In the near future, the communication between autonomous cars will produce a network of sensors that will allow us to know the state of the roads in real time. Lidar technology, upon which most autonomous cars are based, allows the acquisition of 3D geometric information of the environment. The objective of this work is to use point clouds acquired by Mobile Laser Scanning (MLS) to segment the main elements of the road environment (road surface, ditches, guardrails, fences, embankments, and borders) with PointNet. Beforehand, the point cloud is automatically divided into sections so that the semantic segmentation scales to different case studies, regardless of their shape or length. An overall accuracy of 92.5% has been obtained, but with large variations between classes: elements with a greater number of points have been segmented more effectively than the rest. In comparison with other point-by-point extraction and ANN-based classification techniques, the same success rates have been obtained for road surfaces and fences, and better results have been obtained for guardrails. Semantic segmentation with PointNet is suitable when segmenting the scene as a whole; however, if only certain classes are of interest, there are other alternatives that do not require such a high training cost. |
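For readers interested in the pipeline the abstract outlines (splitting the MLS cloud into road-aligned sections, then classifying each section point by point with a PointNet-style network), a minimal NumPy sketch is given below. The section length, the block size, and the `pointnet_model` object are illustrative assumptions, not values or code from the paper.

```python
import numpy as np

NUM_POINTS = 4096        # points per block fed to the network (assumed, not from the paper)
SECTION_LENGTH = 10.0    # section length in metres along the road axis (assumed)

def split_into_sections(points, section_length=SECTION_LENGTH):
    """Split an MLS point cloud (N x 3 XYZ array) into consecutive sections
    along its principal (longitudinal) axis, so that segmentation scales to
    roads of arbitrary shape and length."""
    centered = points - points.mean(axis=0)
    axis = np.linalg.svd(centered, full_matrices=False)[2][0]  # first principal direction
    t = centered @ axis                                        # 1-D coordinate along the road
    bins = np.floor((t - t.min()) / section_length).astype(int)
    return [points[bins == b] for b in np.unique(bins)]

def sample_section(section, num_points=NUM_POINTS):
    """Randomly resample a section to the fixed number of points a
    PointNet-style network expects."""
    idx = np.random.choice(len(section), num_points,
                           replace=len(section) < num_points)
    return section[idx]

# Hypothetical usage; `pointnet_model` stands in for a trained PointNet
# semantic-segmentation network and is not defined here.
# cloud = np.loadtxt("road_mls.xyz")            # N x 3 XYZ points
# for sec in split_into_sections(cloud):
#     block = sample_section(sec)[None, ...]    # shape (1, NUM_POINTS, 3)
#     labels = pointnet_model.predict(block)    # per-point class labels
```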
ISSN: | 1424-8220 |
DOI: | 10.3390/s19163466 |