Automatic Extrinsic Calibration of Vision and Lidar by Maximizing Mutual Information


Bibliographic Details
Published in: Journal of Field Robotics, 2015-08, Vol. 32 (5), p. 696-722
Authors: Pandey, Gaurav; McBride, James R.; Savarese, Silvio; Eustice, Ryan M.
Format: Article
Language: English
Online access: Full text
Abstract: This paper reports on an algorithm for automatic, targetless, extrinsic calibration of a lidar and optical camera system based upon the maximization of mutual information between the sensor‐measured surface intensities. The proposed method is completely data‐driven and does not require any fiducial calibration targets, making in situ calibration easy. We calculate the Cramér‐Rao lower bound (CRLB) of the estimated calibration parameter variance, and we show experimentally that the sample variance of the estimated parameters empirically approaches the CRLB when the amount of data used for calibration is sufficiently large. Furthermore, we compare the calibration results to independent ground truth (where available) and observe that the mean error empirically approaches zero as the amount of data used for calibration is increased, thereby suggesting that the proposed estimator is a minimum variance unbiased estimator of the calibration parameters. Experimental results are presented for three different lidar‐camera systems: (i) a three‐dimensional (3D) lidar and omnidirectional camera, (ii) a 3D time‐of‐flight sensor and monocular camera, and (iii) a 2D lidar and monocular camera.
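
As a rough illustration of the mutual-information objective described in the abstract, the following Python sketch (not the authors' implementation) scores a candidate set of extrinsics by the MI between lidar reflectivity values and camera grayscale intensities at the projected pixel locations. The `project` helper, histogram bin count, and data handling are hypothetical assumptions; only the MI objective itself follows the idea stated above.

    import numpy as np

    def mutual_information(a, b, bins=64):
        """MI (in nats) between two 1-D intensity samples via a joint histogram."""
        joint, _, _ = np.histogram2d(a, b, bins=bins)
        pxy = joint / joint.sum()                    # joint distribution estimate
        px = pxy.sum(axis=1, keepdims=True)          # marginal over a
        py = pxy.sum(axis=0, keepdims=True)          # marginal over b
        nz = pxy > 0                                 # avoid log(0)
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    def calibration_objective(theta, lidar_refl, lidar_pts, image, project):
        """MI between lidar reflectivity and image intensity under extrinsics theta.

        `project(pts, theta)` is an assumed helper that returns integer pixel
        coordinates and a visibility mask for the lidar points when projected
        into the camera under the candidate extrinsic parameters theta.
        """
        uv, visible = project(lidar_pts, theta)
        cam_intensity = image[uv[visible, 1], uv[visible, 0]]
        return mutual_information(lidar_refl[visible], cam_intensity)

In this sketch, the calibration would be obtained by searching (e.g., gradient-free or grid search) for the six extrinsic parameters that maximize calibration_objective over many scan-image pairs; the paper's point is that this maximum coincides with the true extrinsics when enough data are used.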
ISSN: 1556-4959, 1556-4967
DOI: 10.1002/rob.21542