Separation Fusion Transformer and Efficient Reuse Matching Network for Aerial Tracking


Bibliographic Details
Published in: IEEE Geoscience and Remote Sensing Letters, 2024, Vol. 21, pp. 1-5
Authors: Deng, Anping; Chen, Dianbing; Han, Guangliang; Yang, Hang; Liu, Zhichao; Liu, Faxue
Format: Article
Language: English
Description
Abstract: Owing to its unique viewpoint and the continuity of tracking, the use of UAVs for tracking has shown immense potential in aviation remote sensing. However, constrained by the complexity of real-world environments during tracking and the limited computational capabilities of onboard computing platforms, existing tracking networks struggle to combine superior tracking performance with efficient computational speed. To address this pivotal issue, we propose the Separation Fusion Transformer and Efficient Reuse Matching Network for Aerial Tracking (SFTrack). Specifically, we design the separation fusion transformer (SFT), which leverages a meticulously crafted low-latency transformer architecture to extract robust feature information about the target, enhancing the algorithm's ability to distinguish the target from intricate backgrounds. Furthermore, the efficient reuse matching network (ERM) performs feature matching efficiently to accurately determine the target's scale and location. SFTrack delivers efficient and precise performance on drone tracking tasks. Compared with other drone-to-ground tracking algorithms, SFTrack leads in accuracy across multiple datasets while reaching 54 frames per second (FPS) on an embedded platform, validating the feasibility and effectiveness of our tracker in practical drone deployments.
ISSN: 1545-598X, 1558-0571
DOI: 10.1109/LGRS.2024.3436846
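
As a rough illustration of the two-stage design described in the abstract (a lightweight transformer stage for feature extraction, followed by a matching stage that predicts the target's scale and location), the minimal PyTorch sketch below shows one common way such components are wired together. The module names (SeparationFusionBlock, ReuseMatchingHead), the feature dimensions, and the depth-wise cross-correlation used for matching are assumptions made for illustration only; they are not taken from the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SeparationFusionBlock(nn.Module):
    # Assumed low-latency transformer block: pre-norm self-attention plus a
    # small feed-forward fusion layer (hypothetical stand-in for the SFT).
    def __init__(self, dim=256, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.ffn = nn.Sequential(nn.Linear(dim, 2 * dim), nn.GELU(), nn.Linear(2 * dim, dim))

    def forward(self, tokens):                      # tokens: (B, N, dim)
        h = self.norm1(tokens)
        tokens = tokens + self.attn(h, h, h)[0]     # residual self-attention
        return tokens + self.ffn(self.norm2(tokens))

class ReuseMatchingHead(nn.Module):
    # Assumed matching stage: depth-wise cross-correlation between template
    # and search features, with the correlation map reused by both the
    # classification and the box-regression heads.
    def __init__(self, dim=256):
        super().__init__()
        self.cls = nn.Conv2d(dim, 1, kernel_size=1)   # target/background score map
        self.reg = nn.Conv2d(dim, 4, kernel_size=1)   # box offsets -> scale and location

    def forward(self, z, x):                          # z: (B,C,Hz,Wz), x: (B,C,Hx,Wx)
        b, c = x.shape[:2]
        corr = F.conv2d(x.reshape(1, b * c, *x.shape[-2:]),
                        z.reshape(b * c, 1, *z.shape[-2:]),
                        groups=b * c)                 # per-channel correlation
        corr = corr.view(b, c, *corr.shape[-2:])
        return self.cls(corr), self.reg(corr)

For example, with a (1, 256, 8, 8) template feature map and a (1, 256, 16, 16) search feature map, ReuseMatchingHead returns 9x9 classification and regression maps from which a box can be decoded; in this sketch the transformer block would be applied to the flattened feature tokens before matching.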