Sharp Attention Network via Adaptive Sampling for Person Re-Identification

Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Circuits and Systems for Video Technology, 2019-10, Vol. 29 (10), pp. 3016-3027
Main Authors: Shen, Chen; Qi, Guo-Jun; Jiang, Rongxin; Jin, Zhongming; Yong, Hongwei; Chen, Yaowu; Hua, Xian-Sheng
Format: Article
Language: English
Description
Abstract: In this paper, we present novel sharp attention networks that adaptively sample feature maps from convolutional neural networks for person re-identification (re-ID). Owing to the introduction of sampling-based attention models, the proposed approach can adaptively generate sharper attention-aware feature masks. This greatly differs from gating-based attention mechanisms, which rely on soft gating functions to select the relevant features for person re-ID. In contrast, the proposed sampling-based attention mechanism effectively trims irrelevant features by forcing the resultant feature masks to focus on the most discriminative features. It produces sharper attention that is more assertive in localizing subtle features relevant to re-identifying people across cameras. For this purpose, a differentiable Gumbel-Softmax sampler is employed to approximate Bernoulli sampling when training the sharp attention networks. Extensive experimental evaluations demonstrate the superiority of this new sharp attention model over existing, published state-of-the-art methods on three challenging person re-ID benchmarks: CUHK03, Market-1501, and DukeMTMC-reID.
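
The key technique named in the abstract, approximating Bernoulli sampling with a differentiable Gumbel-Softmax sampler to obtain near-binary attention masks, can be illustrated with a short sketch. The PyTorch code below is a hypothetical reconstruction, not the authors' implementation; the SharpAttention module, its 1x1-conv logit head, and the temperature tau are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SharpAttention(nn.Module):
    # Hypothetical sketch of a sampling-based attention mask built on the
    # Gumbel-Softmax relaxation of Bernoulli sampling (not the paper's code).
    def __init__(self, channels: int, tau: float = 0.5):
        super().__init__()
        # A 1x1 conv predicts a per-location "keep" logit from the features.
        self.logit = nn.Conv2d(channels, 1, kernel_size=1)
        self.tau = tau  # temperature: lower values give sharper, more binary masks

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        keep = self.logit(feats)                  # (N, 1, H, W)
        # Two-class logits [keep, drop] encode the binary (Bernoulli) choice.
        logits = torch.cat([keep, -keep], dim=1)  # (N, 2, H, W)
        # Differentiable sampling: hard=True yields 0/1 masks in the forward
        # pass with straight-through gradients in the backward pass.
        mask = F.gumbel_softmax(logits, tau=self.tau, hard=True, dim=1)
        # Multiply by the "keep" channel to trim irrelevant feature locations.
        return feats * mask[:, :1]

# Usage: sharpen a ResNet-style feature map of person images.
feats = torch.randn(4, 256, 24, 8)
attended = SharpAttention(channels=256)(feats)

Because hard=True discretizes the mask while keeping gradients via the straight-through estimator, the network can be trained end to end, which matches the abstract's point that sampling yields sharper, more assertive masks than soft gating functions.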
ISSN:1051-8215
1558-2205
DOI:10.1109/TCSVT.2018.2872503