LSVL: Large-scale season-invariant visual localization for UAVs
Published in: Robotics and Autonomous Systems, 2023-10, Vol. 168, p. 104497, Article 104497
Main authors:
Format: Article
Language: English
Online access: Full text
Abstract: Localization of autonomous unmanned aerial vehicles (UAVs) relies heavily on Global Navigation Satellite Systems (GNSS), which are susceptible to interference. Especially in security applications, robust localization algorithms independent of GNSS are needed to provide dependable operation of autonomous UAVs even under interference. Typical non-GNSS visual localization approaches rely on a known starting pose, work only on a small map, or require the flight path to be known before a mission starts. We consider the problem of localization with no information on the initial pose or the planned flight path. We propose a solution for global visual localization on large maps, based on matching orthoprojected UAV images to satellite imagery using learned season-invariant descriptors, and test it with environment sizes up to 100 km². We show that the method is able to determine the heading, latitude and longitude of the UAV at 12.6–18.7 m lateral translation error in as few as 23.2–44.4 updates from an uninformed initialization, even in situations of significant seasonal appearance difference (winter–summer) between the UAV image and the map. We evaluate the characteristics of multiple neural network architectures for generating the descriptors, as well as likelihood estimation methods that provide fast convergence and low localization error. We also evaluate the algorithm on real UAV data and measure running time on a real-time embedded platform. We believe this is the first work able to recover the pose of a UAV at this scale and rate of convergence while allowing significant seasonal difference between camera observations and the map.
Highlights:
• A UAV localization approach for solving the wake-up robot problem at large scale.
• Tolerates significant seasonal appearance change and ambiguity of the environment.
• Converges to 12.6–18.7 m translation error starting from 100 km² uncertainty.
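The core idea summarized in the abstract, matching a descriptor computed from an orthoprojected UAV image against precomputed descriptors of satellite map cells, can be illustrated with a minimal sketch. This is a hypothetical toy example, not the paper's actual networks or likelihood models: it assumes L2-normalized descriptors per map cell and scores a query descriptor against all cells by cosine similarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(v):
    """Scale descriptor(s) to unit L2 norm along the last axis."""
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Hypothetical map of 100 cells (e.g., a 10x10 grid covering ~100 km²),
# each with a precomputed 64-dimensional season-invariant descriptor.
map_descriptors = normalize(rng.normal(size=(100, 64)))

# Simulate a UAV observation over cell 42: its descriptor is the map
# descriptor perturbed by appearance change (modeled here as noise).
true_cell = 42
query = normalize(map_descriptors[true_cell] + 0.1 * rng.normal(size=64))

# Cosine similarity of the query against every map cell; since all vectors
# are unit-norm, a matrix-vector product gives the similarity scores.
scores = map_descriptors @ query
best_cell = int(np.argmax(scores))
```

In a real system a single match like this would be one measurement update; the paper fuses many such updates over a flight to converge from an uninformed initialization.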
ISSN: 0921-8890, 1872-793X
DOI: 10.1016/j.robot.2023.104497