Vehicle Localization and Classification Using Off-Board Vision and 3-D Models


Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Robotics, April 2014, Vol. 30 (2), pp. 432-447
Main Authors: Hoermann, Stefan; Borges, Paulo Vinicius Koerich
Format: Article
Language: English
Description
Abstract: Vehicle localization is one of the most common tasks in robotics. Using vision as a sensor, most methods perform localization from cameras mounted on board the vehicle. In contrast, we propose a method based on an off-board camera. The system uses a similarity measure between a camera image and a synthetic image generated using a 3-D vehicle model, ideally converging to the true pose of the vehicle. The similarity measure is based on a model of shading appearance, which depends on the surface curvature of the 3-D model. Considering that the area observed by the camera is fairly planar, a rough initial estimate of the object's position can be obtained using 2-D blob tracking. The 3-D model of the vehicle can be rendered near the vehicle in the real image, and using the proposed similarity measure, it converges to the correct pose. A classification function to discriminate between different vehicles is also proposed, making it possible for the system to identify and track multiple vehicles of interest. A number of experiments are performed with different vehicles outdoors, in a real industrial environment, under different illumination conditions. The low average error compared with a laser-based ground truth illustrates the applicability of the method.
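The pipeline the abstract describes (a rough pose from 2-D blob tracking, refined by maximizing a similarity measure between the camera image and a synthetic rendering of the vehicle model) can be illustrated with a toy sketch. Everything below is a simplification for illustration: the binary box rendering, the intersection-over-union similarity, and the greedy hill-climbing search are stand-ins, not the paper's shading-appearance measure or its 3-D model.

```python
# Toy 2-D version of the described pipeline: blob centroid -> rough pose,
# then refine the pose by climbing a similarity score between the camera
# image and a synthetic rendering at the candidate pose.

def render(pose, w=40, h=30, vw=8, vh=5):
    """Synthetic binary image of a vehicle-sized box with top-left at `pose`."""
    x0, y0 = pose
    return [[1 if x0 <= x < x0 + vw and y0 <= y < y0 + vh else 0
             for x in range(w)] for y in range(h)]

def similarity(img_a, img_b):
    """Intersection-over-union of foreground pixels (stand-in similarity measure)."""
    inter = sum(a & b for ra, rb in zip(img_a, img_b) for a, b in zip(ra, rb))
    union = sum(a | b for ra, rb in zip(img_a, img_b) for a, b in zip(ra, rb))
    return inter / union if union else 0.0

def blob_centroid(img, vw=8, vh=5):
    """Rough initial pose from the foreground blob centroid (2-D blob tracking stand-in)."""
    pts = [(x, y) for y, row in enumerate(img) for x, v in enumerate(row) if v]
    cx = round(sum(x for x, _ in pts) / len(pts))
    cy = round(sum(y for _, y in pts) / len(pts))
    return (cx - vw // 2, cy - vh // 2)  # centroid -> top-left corner of the box

def refine_pose(camera_img, pose, steps=50):
    """Greedy hill climbing on the similarity score with one-pixel moves."""
    best = similarity(camera_img, render(pose))
    for _ in range(steps):
        moved = False
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            cand = (pose[0] + dx, pose[1] + dy)
            s = similarity(camera_img, render(cand))
            if s > best:
                pose, best, moved = cand, s, True
                break
        if not moved:
            break  # no neighbor improves the score: converged pose
    return pose, best
```

A short usage example, deliberately perturbing the blob estimate to mimic the rough initialization the abstract mentions:

```python
true_pose = (12, 9)
camera = render(true_pose)               # stand-in for the real camera image
rough = blob_centroid(camera)            # rough estimate from the blob
x0, y0 = rough
pose, score = refine_pose(camera, (x0 + 3, y0 - 2))  # start from a perturbed pose
# the refined pose recovers true_pose with similarity 1.0
```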
ISSN: 1552-3098, 1941-0468
DOI: 10.1109/TRO.2013.2291613