A linear method for calibrating LIDAR-and-camera systems
Main authors:
Format: Conference paper
Language: English
Subjects:
Online access: Order full text
Abstract: This article describes a multimedia system consisting of two sensors: (1) a laser range scanner (LIDAR) and (2) a conventional digital camera. Our work specifies a mathematical calibration model that allows the data from these sensors to be explicitly integrated. Data integration is accomplished by calibrating the system, i.e., estimating each variable of the model for a specific LIDAR-and-camera pair. Our approach requires detection of feature points in both the LIDAR scan and the digital images. Using correspondences between feature points, we can then estimate the model variables that specify an explicit mathematical relationship between sensed (x, y, z) LIDAR points and (x, y) digital image positions. Our system is designed for 3D line scanners, i.e., scanners that detect positions lying in a 3D plane, which requires some special theoretical and experimental treatment. Results are provided for simulations of the system in a virtual environment and for a real LIDAR-and-camera system consisting of a SICK LMS200 and an inexpensive USB web camera. Calibrated systems can integrate the data in real time, which is of particular use for autonomous vehicular and robotic navigation.
ISSN: 1526-7539, 2375-0227
DOI: 10.1109/MASCOT.2009.5366801
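
The record does not reproduce the paper's actual model, but the abstract describes estimating the variables of a linear mapping from sensed (x, y, z) LIDAR points to (x, y) image positions using feature-point correspondences. The sketch below is a minimal, generic illustration of one such linear calibration, a direct linear transform (DLT) that fits a 3x4 projection matrix; the function names and the use of NumPy are assumptions for illustration, not the paper's method.

```python
import numpy as np


def estimate_projection_matrix(points_3d, points_2d):
    """Estimate a 3x4 matrix P with [u, v, 1]^T ~ P [x, y, z, 1]^T
    via the direct linear transform (illustrative sketch only).

    points_3d : (N, 3) sensed (x, y, z) points
    points_2d : (N, 2) corresponding (u, v) image positions
    Needs N >= 6 correspondences in general (non-planar) position.
    """
    rows = []
    for (x, y, z), (u, v) in zip(points_3d, points_2d):
        X = np.array([x, y, z, 1.0])
        zero = np.zeros(4)
        # Each correspondence contributes two linear constraints on P.
        rows.append(np.concatenate([X, zero, -u * X]))
        rows.append(np.concatenate([zero, X, -v * X]))
    A = np.vstack(rows)
    # The solution (up to scale) is the right singular vector of A
    # associated with its smallest singular value.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 4)


def project(P, points_3d):
    """Map (x, y, z) points through P to (u, v) pixel coordinates."""
    X_h = np.hstack([np.asarray(points_3d), np.ones((len(points_3d), 1))])
    x_h = X_h @ P.T
    return x_h[:, :2] / x_h[:, 2:3]
```

Note that a single scan from a 3D line scanner lies in one plane, for which this general DLT would be degenerate; as the abstract states, that planar case requires special theoretical and experimental treatment, so the sketch applies only to correspondences spanning general 3D positions.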