Poses as Queries: End-to-End Image-to-LiDAR Map Localization With Transformers
Published in: IEEE Robotics and Automation Letters, January 2024, Vol. 9, No. 1, pp. 803-810
Main authors:
Format: Article
Language: English
Abstract: High-precision vehicle localization with commercial sensor setups is a crucial technique for high-level autonomous driving tasks. As a newly emerged approach, monocular localization in a LiDAR map achieves a promising balance between cost and accuracy, but estimating pose by finding correspondences between such cross-modal sensor data is challenging, which degrades localization accuracy. In this letter, we address the problem by proposing a novel Transformer-based neural network that registers 2D images into a 3D LiDAR map in an end-to-end manner. We first implicitly represent poses as high-dimensional feature vectors called pose queries and gradually optimize the poses by interacting with relevant information retrieved from cross-modal features via the attention mechanism in the proposed POse Estimator Transformer (POET) module. Moreover, we apply a multiple-hypotheses aggregation method that estimates the final pose by performing parallel optimization on multiple randomly initialized pose queries, reducing network uncertainty. Comprehensive analysis and experimental results on public benchmarks show that the proposed image-to-LiDAR map localization network achieves state-of-the-art performance on challenging cross-modal localization tasks.
ISSN: 2377-3766
DOI: 10.1109/LRA.2023.3337704
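To make the idea in the abstract concrete, below is a minimal PyTorch sketch of a pose-query decoder: a set of randomly initialized pose queries is refined by cross-attending to fused image/LiDAR-map features, and the parallel hypotheses are aggregated into one 6-DoF pose. This is not the authors' implementation; the module name, feature dimensions, the standard Transformer decoder, the 6-parameter pose head, and the mean aggregation are all assumptions for illustration only.

```python
# Illustrative sketch of "poses as queries" with multi-hypothesis aggregation.
# All names, dimensions, and the aggregation rule are assumptions, not the
# paper's actual architecture.

import torch
import torch.nn as nn


class PoseQueryDecoder(nn.Module):
    """Refines a set of pose queries by attending to cross-modal features."""

    def __init__(self, d_model: int = 256, n_heads: int = 8, n_layers: int = 4,
                 n_hypotheses: int = 8):
        super().__init__()
        # Multiple randomly initialized pose queries (parallel hypotheses).
        self.pose_queries = nn.Parameter(torch.randn(n_hypotheses, d_model))
        layer = nn.TransformerDecoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=n_layers)
        # Regress a 6-DoF pose (3 translation + 3 rotation parameters) per query.
        self.pose_head = nn.Linear(d_model, 6)

    def forward(self, cross_modal_feats: torch.Tensor) -> torch.Tensor:
        """
        cross_modal_feats: (B, N, d_model) tokens from an upstream fused
        image / LiDAR-map feature extractor (assumed to exist).
        Returns: (B, 6) aggregated pose estimate.
        """
        B = cross_modal_feats.shape[0]
        queries = self.pose_queries.unsqueeze(0).expand(B, -1, -1)
        # Each query gathers relevant cross-modal evidence via attention.
        refined = self.decoder(tgt=queries, memory=cross_modal_feats)
        hypotheses = self.pose_head(refined)  # (B, n_hypotheses, 6)
        # Aggregate the parallel hypotheses (simple mean here; the paper's
        # aggregation scheme may differ).
        return hypotheses.mean(dim=1)


if __name__ == "__main__":
    feats = torch.randn(2, 1024, 256)   # dummy fused image/LiDAR-map tokens
    pose = PoseQueryDecoder()(feats)
    print(pose.shape)                   # torch.Size([2, 6])
```

The design choice this sketch highlights is that the pose is never regressed in a single shot: each query is iteratively updated layer by layer through attention over the cross-modal features, and running several queries in parallel lets the network express and then average out its own uncertainty.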