A Semantic Guidance and Transformer-based Matching Method for UAVs and Satellite Images for UAV Geo-localization

Bibliographic Details
Published in: IEEE Access, 2022-01, Vol. 10, p. 1-1
Authors: Zhuang, Jiedong; Chen, Xuruoyan; Dai, Ming; Lan, Wenbo; Cai, Yongheng; Zheng, Enhui
Format: Article
Language: English
Online access: Full text
Description
Abstract: It is a challenging task for unmanned aerial vehicles (UAVs) without a positioning system to locate targets using images. Matching drone and satellite images is one of the key steps in this task. Because of the large angle and scale gaps between drone and satellite views, it is essential to extract fine-grained features with strong characterization ability. Most published methods are based on CNN structures, which lose a great deal of information owing to the limitations of the convolution operation (e.g., its limited receptive field and downsampling). To make up for this shortcoming, a transformer-based network is proposed to extract richer contextual information. The network promotes feature alignment through a semantic guidance module (SGM), which aligns the same semantic parts in the two images by classifying each pixel based on pixel-wise attention. In addition, the method can easily be combined with existing methods. The proposed method is evaluated on the newest UAV-based geo-localization dataset and achieves an almost 8% improvement in accuracy over the existing state-of-the-art (SOTA) method.
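The record only describes the semantic guidance module at a high level. The sketch below is a rough, hypothetical illustration of the general idea (not the authors' actual implementation): each transformer patch token is softly assigned to a semantic part and features are pooled per part, so drone and satellite images can be compared part by part. The module name, feature dimensions, number of parts, and pooling scheme are all assumptions not stated in the record.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SemanticGuidanceSketch(nn.Module):
    """Hypothetical sketch of a semantic-guidance-style module.

    Each transformer patch token is softly assigned to one of `num_parts`
    semantic classes, and token features are pooled per class so that the
    drone and satellite branches can be compared part by part.
    """

    def __init__(self, dim: int = 768, num_parts: int = 3):
        super().__init__()
        # Per-token classifier producing semantic-part logits (assumed design).
        self.part_classifier = nn.Linear(dim, num_parts)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (B, N, dim) patch features from a transformer backbone.
        logits = self.part_classifier(tokens)        # (B, N, num_parts)
        weights = logits.softmax(dim=1)              # each token's contribution to each part
        # Weighted pooling: one descriptor per semantic part.
        parts = torch.einsum("bnk,bnd->bkd", weights, tokens)  # (B, num_parts, dim)
        return F.normalize(parts, dim=-1)

# Example: compare drone and satellite descriptors part by part (cosine similarity).
sgm = SemanticGuidanceSketch(dim=768, num_parts=3)
drone_tokens = torch.randn(1, 196, 768)      # dummy ViT-style patch features
sat_tokens = torch.randn(1, 196, 768)
drone_parts, sat_parts = sgm(drone_tokens), sgm(sat_tokens)
similarity = (drone_parts * sat_parts).sum(dim=-1).mean()  # average part-wise similarity

In the paper the per-pixel assignment is reportedly derived from the attention of pixels; the plain linear classifier above stands in for that step purely for illustration.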
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3162693