Band Regrouping and Response-Level Fusion for End-to-End Hyperspectral Object Tracking

Bibliographic Details
Published in: IEEE Geoscience and Remote Sensing Letters, 2022, Vol. 19, pp. 1-5
Authors: Ouyang, Er; Wu, Jianhui; Li, Bin; Zhao, Lin; Hu, Wenjing
Format: Article
Language: English
Description
Abstract: Visual object tracking plays a fundamental role in computer vision. Extracting the unique spectral and spatial features of hyperspectral images (HSIs) can significantly improve tracking performance in complex scenarios, which makes them especially attractive for hyperspectral object tracking. However, because training samples are limited, most current hyperspectral trackers rely on handcrafted features, which cannot sufficiently describe the intrinsic nature of the object. To address this problem, this letter proposes a band regrouping and response-level fusion network (BRRF-Net) for hyperspectral object tracking based on deep transfer learning, which uses a deep model trained on color videos for feature representation. Specifically, a new band regrouping subnetwork is proposed that generates band weights from hyperspectral feature information. The bands are divided into several groups according to these weights and fed into a Siamese network. Finally, a response-level fusion strategy integrates the per-group tracker results to locate the object precisely. Experiments on hyperspectral videos show that BRRF-Net achieves an accuracy of up to 0.689, state-of-the-art performance compared with current hyperspectral object trackers, demonstrating its effectiveness and superiority.
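
The letter itself provides no reference implementation; the following Python sketch only illustrates one plausible reading of the pipeline summarized in the abstract, under stated assumptions: band weights are approximated by per-band spatial variance instead of the learned band regrouping subnetwork, bands are split into fixed three-band groups so an RGB-pretrained Siamese tracker could consume them, the per-group response maps are stand-in random arrays rather than real tracker outputs, and fusion is a simple weighted sum. All names (band_weights, regroup_bands, fuse_responses) are hypothetical.

import numpy as np

def band_weights(hsi_frame):
    # Hypothetical band-weighting step: score each spectral band by its
    # spatial variance (a stand-in for the learned band regrouping subnetwork).
    # hsi_frame: (H, W, B) hyperspectral frame
    w = hsi_frame.reshape(-1, hsi_frame.shape[-1]).var(axis=0)
    return w / (w.sum() + 1e-12)

def regroup_bands(hsi_frame, group_size=3):
    # Sort bands by weight and split them into false-color groups of
    # `group_size` bands so each group can be fed to an RGB-pretrained tracker.
    # Leftover lowest-weight bands that do not fill a group are dropped here;
    # the actual letter may handle them differently.
    w = band_weights(hsi_frame)
    order = np.argsort(w)[::-1]
    groups, group_w = [], []
    for i in range(0, len(order) - group_size + 1, group_size):
        idx = order[i:i + group_size]
        groups.append(hsi_frame[..., idx])
        group_w.append(w[idx].sum())
    group_w = np.asarray(group_w)
    return groups, group_w / group_w.sum()

def fuse_responses(response_maps, group_w):
    # Response-level fusion: weighted sum of the per-group response maps,
    # then take the peak of the fused map as the object location.
    fused = sum(wg * r for wg, r in zip(group_w, response_maps))
    return np.unravel_index(np.argmax(fused), fused.shape), fused

# Toy usage with a random 16-band frame and dummy per-group responses.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.random((64, 64, 16))
    groups, gw = regroup_bands(frame)
    # Stand-in for running an RGB Siamese tracker on each 3-band group.
    responses = [rng.random((17, 17)) for _ in groups]
    (py, px), fused = fuse_responses(responses, gw)
    print(f"{len(groups)} groups, fused peak at ({py}, {px})")

In BRRF-Net proper, the weights come from a trained subnetwork and the response maps from Siamese trackers run on each band group; the sketch only fixes the data flow the abstract describes: weight the bands, regroup them, track each group, and fuse at the response level.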
ISSN: 1545-598X, 1558-0571
DOI: 10.1109/LGRS.2021.3137606