Tightly-coupled robust vision aided inertial navigation algorithm for augmented reality using monocular camera and IMU
| Main Authors: | , , |
|---|---|
| Format: | Conference Proceedings |
| Language: | English; Japanese |
| Subjects: | |
| Online Access: | Order full text |
| Abstract: | The odometry component of a camera tracking system for augmented reality applications is described. The system uses a MEMS-type inertial measurement unit (IMU) with 3-axis gyroscopes and accelerometers together with a monocular camera to track the camera motion accurately and robustly in 6 degrees of freedom (with correct scale) in arbitrary indoor or outdoor scenes. Tight coupling of the IMU and the camera is achieved by an error-state extended Kalman filter (EKF) that performs sensor fusion for inertial navigation at a deep level: each visually tracked feature contributes as an individual measurement, as opposed to more traditional approaches in which camera pose estimates are first extracted by feature tracking and then used as measurement updates in a filter framework. Robustness is achieved by a geometric hypothesize-and-test architecture based on the five-point relative pose estimation method, rather than a Mahalanobis-distance gating mechanism derived from the Kalman filter state prediction; this architecture selects the inlier tracks and removes outliers from the raw feature point matches, which would otherwise corrupt the filter since the tracks are used directly as measurements. |
| DOI: | 10.1109/ISMAR.2011.6143485 |
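
The abstract contrasts hypothesize-and-test outlier rejection with filter-derived Mahalanobis gating. A minimal sketch of that idea follows, using OpenCV's RANSAC-wrapped five-point solver (`cv2.findEssentialMat`) as a stand-in for the paper's own implementation, which is not published in this record:

```python
import numpy as np
import cv2

def select_inlier_tracks(pts_prev, pts_curr, K, thresh_px=1.0):
    """Keep only feature matches consistent with a single relative pose,
    fitted by the five-point method inside RANSAC (hypothesize-and-test),
    instead of gating matches with a filter-derived Mahalanobis distance.

    pts_prev, pts_curr: (N, 2) float arrays of matched pixel coordinates.
    K: 3x3 camera intrinsic matrix.
    """
    E, mask = cv2.findEssentialMat(
        pts_prev, pts_curr, K,
        method=cv2.RANSAC, prob=0.999, threshold=thresh_px,
    )
    inliers = mask.ravel().astype(bool)  # True for tracks kept as measurements
    return E, inliers

# Usage with synthetic two-view geometry (replace with real tracked features):
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(100, 3)) + np.array([0.0, 0.0, 4.0])
t = np.array([0.2, 0.0, 0.0])            # camera translates along x
p1 = X @ K.T
p1 = p1[:, :2] / p1[:, 2:]               # pixels in the first view
p2h = (X - t) @ K.T
p2 = p2h[:, :2] / p2h[:, 2:]             # pixels in the second view
p2[:5] += 50.0                           # corrupt five matches into outliers
E, inliers = select_inlier_tracks(p1, p2, K)
print(f"kept {inliers.sum()} of {len(inliers)} tracks")
```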
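The tightly coupled update the abstract describes, where each tracked feature is an individual measurement rather than an input to a pose solver, can likewise be sketched as a sequential error-state EKF update. The helpers `h_fn`, `H_fn`, and the `track.pixel` field below are hypothetical placeholders, since the record does not include the paper's formulation:

```python
import numpy as np

def ekf_update_tracks(dx, P, tracks, h_fn, H_fn, sigma_px=1.0):
    """Sequential error-state EKF update in which each inlier feature track
    contributes its own 2D pixel measurement, rather than fusing a single
    pre-computed camera pose estimate.

    dx: error-state vector (e.g. attitude, velocity, position, bias errors).
    P:  error-state covariance matrix.
    h_fn(track, dx) -> predicted pixel projection of the track's feature.
    H_fn(track, dx) -> 2 x dim(dx) measurement Jacobian at the current state.
    (h_fn, H_fn, and track.pixel are assumed interfaces, not the paper's.)
    """
    R = (sigma_px ** 2) * np.eye(2)      # isotropic pixel measurement noise
    I = np.eye(P.shape[0])
    for track in tracks:
        H = H_fn(track, dx)              # linearize projection for this track
        S = H @ P @ H.T + R              # 2x2 innovation covariance
        K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
        dx = dx + K @ (track.pixel - h_fn(track, dx))
        P = (I - K @ H) @ P              # covariance update
    return dx, P
```

Because every inlier track enters the filter directly, a single undetected outlier would bias the state estimate, which is why the geometric rejection step above precedes the update.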