Automatic recovery of relative camera rotations for urban scenes

Bibliographic Details
Main Authors: Antone, M.E., Teller, S.
Format: Conference Proceeding
Language: English
Description
Summary: In this paper we describe a formulation of extrinsic camera calibration that decouples rotation from translation by exploiting properties inherent in urban scenes. We then present an algorithm which uses edge features to robustly and accurately estimate relative rotations among multiple cameras given intrinsic calibration and approximate initial pose. The algorithm is linear both in the number of images and the number of features. We estimate the number and directions of vanishing points (VPs) with respect to each camera using a hybrid approach that combines the robustness of the Hough transform with the accuracy of expectation maximization. Matching and labeling methods identify unique VPs and correspond them across all cameras. Finally, a technique akin to bundle adjustment produces globally optimal estimates of relative camera rotations by bringing all VPs into optimal alignment. Uncertainty is modeled and used at every stage to improve accuracy. We assess the algorithm's performance on both synthetic and real data, and compare our results to those of semi-automated photogrammetric methods for a large set of real hemispherical images, using several consistency and error metrics.
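To make the final alignment stage concrete, the following is a minimal sketch, in Python with NumPy, of recovering the relative rotation between two cameras from corresponded vanishing-point directions. It is not the paper's implementation: it solves the standard orthogonal-Procrustes (Wahba) alignment problem with an SVD, and the function name, the optional per-VP weights standing in for the paper's modeled uncertainty, and the toy data are all illustrative assumptions.

    import numpy as np

    def relative_rotation_from_vps(vps_a, vps_b, weights=None):
        """Rotation R (3x3) minimizing sum_i w_i * ||R a_i - b_i||^2.

        vps_a, vps_b -- (N, 3) corresponded, sign-consistent unit VP
                        directions seen from cameras A and B, N >= 2.
        weights      -- optional (N,) confidences (e.g. inverse variances),
                        standing in for the per-VP uncertainty the paper models.
        """
        vps_a = np.asarray(vps_a, dtype=float)
        vps_b = np.asarray(vps_b, dtype=float)
        if weights is None:
            weights = np.ones(len(vps_a))
        # Weighted cross-covariance between the two sets of directions.
        H = (weights[:, None] * vps_a).T @ vps_b
        U, _, Vt = np.linalg.svd(H)
        # Correction term guarantees a proper rotation (det = +1).
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

    # Toy check: camera B is camera A rotated by a known rotation, and both
    # observe the three mutually orthogonal VPs typical of urban scenes.
    rng = np.random.default_rng(0)
    true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(true_R) < 0:
        true_R[:, 0] = -true_R[:, 0]    # force a proper rotation
    vps_a = np.eye(3)                   # VP directions seen from camera A
    vps_b = vps_a @ true_R.T            # the same VPs seen from camera B
    assert np.allclose(relative_rotation_from_vps(vps_a, vps_b), true_R)

Because VP directions are related by a pure rotation (translation drops out entirely, as the abstract's decoupling exploits), a handful of corresponded directions suffices; the weighted least-squares form mirrors, in spirit, the paper's use of uncertainty at every stage.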
ISSN: 1063-6919
DOI: 10.1109/CVPR.2000.854809