Camera Pose Estimation Based on PnL With a Known Vertical Direction


Bibliographic Details
Published in: IEEE Robotics and Automation Letters, Oct. 2019, Vol. 4, No. 4, pp. 3852-3859
Authors: Lecrosnier, Louis; Boutteau, Remi; Vasseur, Pascal; Savatier, Xavier; Fraundorfer, Friedrich
Format: Article
Language: English
Description
Abstract: In this letter, we address the problem of camera pose estimation using two-dimensional (2-D) and 3-D line features, also known as PnL (Perspective-n-Line), with a known vertical direction. The minimal number of line correspondences required to estimate the complete camera pose is 3 (P3L) in the general case, yielding a minimum of eight possible solutions. Prior knowledge of the vertical direction, such as provided by common sensors (e.g., an inertial measurement unit, or IMU), reduces the problem to 4 DoF and yields a single solution. We exploit this fact to decouple the remaining rotation estimation from the translation estimation, and we present a twofold contribution. First, we present a linear formulation of the PnL problem in Plücker line coordinates with a known vertical direction, including a Gauss-Newton-based orientation and location refinement to compensate for IMU sensor noise. Second, we propose a new efficient RANdom SAmple Consensus (RANSAC) scheme for both feature pairing and outlier rejection based solely on rotation estimation from two line pairs. This greatly diminishes the computational cost compared to a RANSAC3 or RANSAC4 scheme. We evaluate our algorithms on synthetic data and on our own real dataset. Experimental results show state-of-the-art accuracy and runtime in the presence of 2-D noise, 3-D noise, and vertical direction sensor noise.
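To illustrate the decoupling the abstract relies on: once the vertical direction is measured by an IMU, roll and pitch can be compensated up front, leaving only the yaw angle about the vertical (plus translation) unknown, which is why the problem drops to 4 DoF. The following is a minimal NumPy sketch of that pre-alignment step; it is not the authors' implementation, and the function name is hypothetical.

```python
import numpy as np

def vertical_alignment_rotation(g):
    """Rotation that maps a measured gravity direction g onto the world
    vertical [0, 0, 1], removing the roll and pitch DoF. After applying
    it, only a yaw rotation about the vertical remains unknown."""
    g = np.asarray(g, dtype=float)
    g = g / np.linalg.norm(g)
    z = np.array([0.0, 0.0, 1.0])
    v = np.cross(g, z)           # rotation axis, scaled by sin(theta)
    s = np.linalg.norm(v)        # sin of the rotation angle
    c = g @ z                    # cos of the rotation angle
    if s < 1e-12:                # g already (anti-)parallel to z
        return np.eye(3) if c > 0 else np.diag([1.0, -1.0, -1.0])
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    # Rodrigues' formula: R = I + K + K^2 * (1 - cos) / sin^2
    return np.eye(3) + K + K @ K * ((1.0 - c) / s**2)
```

Noise on the IMU's vertical estimate propagates into this rotation, which is what the Gauss-Newton refinement mentioned in the abstract is meant to correct.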
ISSN: 2377-3766
DOI: 10.1109/LRA.2019.2929982