A model-based object following system
Saved in:
Main Authors: | , , , |
Format: | Conference Proceedings |
Language: | English |
Subjects: | |
Online Access: | Order full text |
Abstract: | In this paper we describe an object following system for ground robot mobility, which incorporates LIDAR-based object perception and model-based lane estimation into control signal generation. The approach enables our autonomous ground vehicle MuCAR-3 to safely follow an object even on curved, narrow roads without using GPS or any prior environmental information, and to push the follower vehicle backwards in case of dead ends or blocked roads. The effectiveness of this approach originates from a tight coupling between object recognition and control signal generation. Objects are detected, classified and tracked using a unique combination of 3D point clouds and a 2½D occupancy grid. With the object information gained, a Kalman filter is used for lane estimation. Furthermore, to cope with the problem of local obstacle avoidance, a set of drivable primitives, called tentacles, is integrated into the system. Using parameters from both, a controller generates an appropriate control signal for the underlying vehicle control circuits. With this approach we are able to demonstrate smooth steering behavior at speeds up to 20 m/s while following an object even in rough terrain with high precision. The system was tested in various urban and non-urban scenarios such as inner-city traffic with crossings including stop lights, as well as roundabouts and pedestrian areas, which require accurate lane execution. |
ISSN: | 1931-0587, 2642-7214 |
DOI: | 10.1109/IVS.2009.5164285 |
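The abstract mentions that a Kalman filter estimates the lane from the positions of the tracked lead object. The paper's actual state model is not given in this record, so the following is only a minimal illustrative sketch: a scalar Kalman filter smoothing noisy lateral offsets of the leader, assuming a near-constant offset with process noise `q` and measurement noise `r` (both hypothetical parameters).

```python
def kalman_lane_estimate(measurements, q=0.01, r=0.25):
    """Illustrative scalar Kalman filter: smooth noisy lateral-offset
    measurements of the tracked lead object into a lane estimate.
    This is a hypothetical simplification, not the paper's filter."""
    x, p = 0.0, 1.0          # state estimate and its variance
    estimates = []
    for z in measurements:
        p += q               # predict: offset assumed near-constant
        k = p / (p + r)      # Kalman gain
        x += k * (z - x)     # update with the measurement residual
        p *= (1.0 - k)       # posterior variance
        estimates.append(x)
    return estimates
```

With constant measurements the estimate converges to the measured value, e.g. `kalman_lane_estimate([2.0] * 50)[-1]` approaches 2.0; the full system would feed such an estimate, together with the tentacle evaluation, into the controller.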