Pyramid and Similarity Based Feature Enhancement Network for Person Re-identification

Bibliographic Details
Published in: Journal of Physics: Conference Series, 2021-04, Vol. 1880 (1), p. 012020
Main Authors: Chu, Chengguo; Qi, Meibin; Jiang, Jianguo; Chen, Cuiqun; Wu, Jingjing
Format: Article
Language: English
Online Access: Full Text
Description
Abstract: Person re-identification (Re-ID) has become an increasingly important task due to its wide applications in intelligent video surveillance. However, because of varying camera shooting angles, pedestrians appear in images at different scales and in different poses, which substantially hinders a model's ability to extract discriminative features. In this paper, we propose a new end-to-end architecture named the Pyramid and Similarity Based Feature Enhancement Network (PS-Net), which includes a Pyramid Joint Attention (PJA) module and a Similarity Feature Fusion (SFF) branch to enhance the feature maps. PJA combines spatial attention at different receptive fields with channel attention, exploiting the complementary strengths of the two attention types to emphasize the discriminative regions of pedestrian features. SFF fuses features that carry different levels of information but share congruent relationships, enriching the information contained in pedestrian features and generating more robust representations. Extensive experiments on two large-scale datasets, Market-1501 and DukeMTMC-ReID, validate the superiority of PS-Net over a wide variety of state-of-the-art Re-ID methods.
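For readers who want a concrete picture of the PJA idea, the following is a minimal PyTorch sketch of a joint attention block that combines spatial attention at several receptive fields with channel attention, as the abstract describes. It is an illustrative assumption, not the authors' implementation: the class name PyramidJointAttention and the parameters dilations and reduction are hypothetical.

import torch
import torch.nn as nn


class PyramidJointAttention(nn.Module):
    """Sketch: multi-receptive-field spatial attention joined with channel attention."""

    def __init__(self, channels: int, dilations=(1, 2, 4), reduction: int = 16):
        super().__init__()
        # One 3x3 spatial-attention conv per dilation rate, emulating
        # "spatial attention at different receptive fields".
        self.spatial_convs = nn.ModuleList(
            nn.Conv2d(2, 1, kernel_size=3, padding=d, dilation=d, bias=False)
            for d in dilations
        )
        # Squeeze-and-excitation style channel attention (an assumed design,
        # since the abstract does not specify the channel-attention form).
        self.channel_mlp = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Spatial attention: pool over channels (avg + max), then average the
        # attention maps produced at each receptive field.
        pooled = torch.cat([x.mean(1, keepdim=True), x.amax(1, keepdim=True)], dim=1)
        spatial = torch.sigmoid(
            torch.stack([conv(pooled) for conv in self.spatial_convs]).mean(0)
        )
        # Channel attention re-weights feature channels.
        channel = self.channel_mlp(x)
        # Jointly enhance the input feature map with both attentions.
        return x * spatial * channel


if __name__ == "__main__":
    feats = torch.randn(2, 256, 24, 8)  # e.g. a Re-ID backbone feature map
    out = PyramidJointAttention(256)(feats)
    print(out.shape)  # torch.Size([2, 256, 24, 8])

The multiplicative combination of the two attention maps is one plausible reading of "combining the advantages of two different attentions"; additive or cascaded fusion would be equally consistent with the abstract.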
ISSN: 1742-6588 (print), 1742-6596 (online)
DOI: 10.1088/1742-6596/1880/1/012020