IMU-Assisted Uncertainty-Weighted Attitude Estimation Algorithm From Noncorresponding Points
Published in: IEEE Sensors Journal, March 2024, Vol. 24 (6), pp. 8281-8292
Main authors: , , , , ,
Format: Article
Language: English
Abstract: Attitude estimation from unknown corresponding points plays a crucial role in visual tasks involving feature points, but current solutions suffer from slow computation and poor tolerance to noise. To address these drawbacks, this article proposes an inertial measurement unit (IMU)-assisted uncertainty-weighted attitude estimation algorithm from noncorresponding points. The algorithm improves both computation speed and robustness to anisotropic noise. The method begins with a preintegration-based initial pose generator, which selects the current frame's initial pose from the previous frame's pose and the IMU measurements between the two frames; this strategy accelerates the computation. The feature-point extraction step is further refined by using the covariance matrix to estimate the error associated with each feature point. From this, a novel spatial collinearity error function based on SoftAssign is formulated by incorporating the Mahalanobis distance, ensuring reliable attitude estimation and correspondence determination. Simulation and experimental results demonstrate that the proposed method significantly improves computational speed while achieving superior accuracy and robustness compared with existing methods.
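The core idea of uncertainty-weighted soft correspondence can be sketched as follows. This is an illustrative reconstruction from the abstract only, not the authors' implementation: it scores candidate matches by squared Mahalanobis distance under each feature point's noise covariance, and applies SoftAssign-style alternating row/column (Sinkhorn) normalization to obtain a soft assignment matrix. The function names, `beta` annealing parameter, and iteration count are assumptions for the sketch.

```python
import numpy as np

def mahalanobis_sq(x, y, cov):
    """Squared Mahalanobis distance between points x and y,
    weighted by the feature point's noise covariance cov."""
    d = x - y
    return d @ np.linalg.inv(cov) @ d

def softassign_weights(src, dst, covs, beta=5.0, iters=20):
    """SoftAssign-style soft correspondence matrix (illustrative).

    src, dst : arrays of shape (m, k) and (n, k) of feature points.
    covs     : list of n covariance matrices, one per dst point,
               modeling anisotropic measurement noise.
    beta     : inverse-temperature weight on the distance term.
    Returns an (m, n) matrix of soft match weights, normalized by
    alternating row/column (Sinkhorn) iterations.
    """
    m, n = len(src), len(dst)
    A = np.empty((m, n))
    for i in range(m):
        for j in range(n):
            # Uncertainty-weighted affinity: closer points (in the
            # Mahalanobis sense) receive exponentially larger weight.
            A[i, j] = np.exp(-beta * mahalanobis_sq(src[i], dst[j], covs[j]))
    for _ in range(iters):
        A /= A.sum(axis=1, keepdims=True)  # normalize rows
        A /= A.sum(axis=0, keepdims=True)  # normalize columns
    return A
```

In a full pipeline, the soft weights would feed the spatial collinearity error term that is minimized jointly over attitude and correspondences; here they merely illustrate how the Mahalanobis distance injects per-point uncertainty into the matching step.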
ISSN: 1530-437X, 1558-1748
DOI: 10.1109/JSEN.2024.3355907