Mono-Vision Based Lateral Localization System of Low-Cost Autonomous Vehicles Using Deep Learning Curb Detection
Published in: Actuators 2021-03, Vol. 10 (3), p. 57
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: The localization system of a low-cost autonomous vehicle such as an autonomous sweeper requires high lateral localization accuracy, as the vehicle must maintain a small lateral distance between its side brush system and the road curb. Existing methods usually rely on a global navigation satellite system, which often loses signal in cluttered environments such as streets between tall buildings and trees. In GPS-denied environments, map-based methods such as visual and LiDAR odometry are often used instead. Apart from the heavy computational cost of feature extraction, these sensors are too expensive for the low-price market of low-cost autonomous vehicles. To address these issues, we propose a mono-vision based lateral localization system for an autonomous sweeper. Our system relies on a fish-eye camera and precisely detects road curbs with a deep curb detection network. Detected curb locations then serve as direct references for controlling the lateral motion of the vehicle. On our self-recorded dataset, our curb detection network achieves 93% pixel-level precision. In addition, experiments are performed with an intelligent sweeper to demonstrate the accuracy and robustness of the proposed approach. Results show that the average lateral distance error and the maximum invalid rate are within 0.035 m and 9.2%, respectively.
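The abstract describes using detected curb locations as direct references for lateral motion control. A minimal sketch of that idea, not the paper's implementation: given one row of the curb detection network's binary output mask near the vehicle, estimate the lateral distance to the curb and derive a proportional steering correction. The pixel-to-metre scale, target distance, gain, and vehicle-edge column below are all assumed values for illustration.

```python
# Illustrative sketch (assumed values, not the paper's implementation):
# turn a per-pixel curb mask row into a lateral-distance error and a
# proportional steering correction.

TARGET_DIST_M = 0.10    # desired brush-to-curb distance (assumed)
METERS_PER_PX = 0.005   # ground-plane scale at the reference row (assumed)
VEHICLE_COL = 0         # image column of the vehicle's curb-side edge (assumed)

def lateral_error(mask_row, target=TARGET_DIST_M):
    """mask_row: list of 0/1 curb labels across one image row near the vehicle.
    Returns (distance_m, error_m), or None when no curb pixel is detected
    (an 'invalid' frame in the abstract's terminology)."""
    curb_cols = [c for c, v in enumerate(mask_row) if v == 1]
    if not curb_cols:
        return None
    dist_m = (min(curb_cols) - VEHICLE_COL) * METERS_PER_PX
    return dist_m, dist_m - target

def steering_correction(error_m, kp=0.8):
    """Simple proportional term: positive error means the curb is farther
    than the target, so steer toward it."""
    return kp * error_m

# Synthetic example: curb pixels start 30 px from the vehicle's edge.
row = [0] * 30 + [1] * 4 + [0] * 6
dist, err = lateral_error(row)
cmd = steering_correction(err)
```

In a real system the fish-eye image would first be undistorted and the per-row scale calibrated; a single proportional gain stands in here for whatever lateral controller the vehicle actually uses.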
ISSN: 2076-0825
DOI: 10.3390/act10030057