Attention Mask-Based Network With Simple Color Annotation for UAV Vehicle Re-Identification
Saved in:
Published in: IEEE Geoscience and Remote Sensing Letters, 2022, Vol. 19, pp. 1-5
Main Authors:
Format: Article
Language: English
Subjects:
Online Access: Order full text
Abstract: Vehicle re-identification (VeID) has attracted growing research interest in recent years, and excellent performance has been achieved with fixed traffic cameras. However, VeID in aerial images taken by unmanned aerial vehicles (UAVs), which involve both variable locations and unusual viewpoints, is still under-explored. Recent works tend to extract meaningful local features through careful annotation, which is effective but time-consuming. To extract discriminative features while avoiding tedious annotation work, this letter develops an attention mask (AM)-based network with simple color annotation for object enhancement and background reduction. The network makes full use of deep features obtained from a pretrained color classification network and then uses principal component analysis (PCA) as a mapping function to obtain AMs without part-level annotation. In addition, we introduce a weighted triplet loss (WTL) function to address the high inter-class similarity caused by the overhead views of UAVs. The loss function concentrates more on negative pairs to strengthen the identification ability of the network. Extensive experiments are conducted on both a UAV dataset and a surveillance dataset, and our method achieves competitive performance compared with recent ReID works.
ISSN: 1545-598X, 1558-0571
DOI: 10.1109/LGRS.2021.3092369
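The central component described in the abstract is the PCA-based attention mask: deep features from a pretrained color classification network are mapped to a spatial mask that enhances the vehicle and suppresses background, without any part-level labels. The record does not spell out how PCA is used as the mapping function, so the PyTorch sketch below is only one plausible reading, where each spatial location's channel vector is projected onto the first principal component and min-max normalized; the function name `pca_attention_mask` and its details are illustrative assumptions, not the letter's exact construction.

```python
import torch

def pca_attention_mask(feat):
    """Sketch: derive a spatial attention mask from a conv feature map via PCA.

    feat: (C, H, W) deep features taken from a pretrained color
    classification backbone. This is an assumed reading of "PCA as a
    mapping function"; the letter's exact formulation may differ.
    """
    C, H, W = feat.shape
    x = feat.reshape(C, H * W).T                  # (H*W, C): one channel vector per location
    x = x - x.mean(dim=0, keepdim=True)           # center the channel vectors
    # First principal direction via SVD of the centered data matrix.
    _, _, vh = torch.linalg.svd(x, full_matrices=False)
    proj = x @ vh[0]                              # projection onto the first component, (H*W,)
    # Min-max normalize to [0, 1]; note the sign of a principal component
    # is arbitrary, so in practice the mask may need to be flipped toward
    # the foreground before it is used to reweight the features.
    mask = (proj - proj.min()) / (proj.max() - proj.min() + 1e-6)
    return mask.reshape(H, W)
```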
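The abstract also introduces a weighted triplet loss (WTL) that concentrates more on negative pairs to cope with the high inter-class similarity seen from overhead UAV views. The exact weighting scheme is not given in this record, so the sketch below is a hedged stand-in: a batch-hard positive term combined with a softmax weighting over negative distances that up-weights hard negatives. The function name, `margin`, and `neg_scale` are assumptions, and a PK-sampled batch (each identity appears with several images, multiple identities per batch) is assumed.

```python
import torch
import torch.nn.functional as F

def weighted_triplet_loss(embeddings, labels, margin=0.3, neg_scale=2.0):
    """Sketch of a triplet loss that emphasizes negative pairs.

    embeddings: (N, D) feature vectors from the backbone.
    labels:     (N,)   integer vehicle IDs.
    The softmax weighting over negatives and the scale factor are
    illustrative, not the letter's exact WTL formulation.
    """
    dist = torch.cdist(embeddings, embeddings, p=2)            # (N, N) pairwise distances
    same_id = labels.unsqueeze(0) == labels.unsqueeze(1)
    eye = torch.eye(len(labels), dtype=torch.bool, device=dist.device)
    pos_mask = same_id & ~eye                                  # positive pairs (exclude self)
    neg_mask = ~same_id                                        # negative pairs

    # Hardest positive per anchor (standard batch-hard mining).
    d_pos = (dist * pos_mask).max(dim=1).values

    # Soft weighting over negatives: closer (harder) negatives get larger weight.
    neg_dist = dist.masked_fill(~neg_mask, float('inf'))
    w_neg = F.softmax(-neg_scale * neg_dist, dim=1)
    d_neg = (w_neg * dist.masked_fill(~neg_mask, 0.0)).sum(dim=1)

    return F.relu(d_pos - d_neg + margin).mean()
```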