Cluster-Guided Asymmetric Contrastive Learning for Unsupervised Person Re-Identification

Bibliographic Details
Published in: IEEE Transactions on Image Processing, 2022, Vol. 31, pp. 3606-3617
Authors: Li, Mingkun; Li, Chun-Guang; Guo, Jun
Format: Article
Language: English
Description
Abstract: Unsupervised person re-identification (Re-ID) aims to match pedestrian images from different camera views in an unsupervised setting. Existing methods for unsupervised person Re-ID are usually built upon pseudo labels obtained from clustering. However, the result of clustering depends heavily on the quality of the learned features, which are overwhelmingly dominated by colors in images. In this paper, we attempt to suppress the negative dominating influence of colors in order to learn more effective features for unsupervised person Re-ID. Specifically, we propose a Cluster-guided Asymmetric Contrastive Learning (CACL) approach for unsupervised person Re-ID, in which the clustering result is leveraged to guide feature learning within a properly designed asymmetric contrastive learning framework. In CACL, instance-level and cluster-level contrastive learning are employed to help the siamese network learn discriminative features with respect to the clustering result, within and between different data augmentation views, respectively. In addition, we present a cluster refinement method and validate that this refinement step helps CACL significantly. Extensive experiments conducted on three benchmark datasets demonstrate the superior performance of our proposal.
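
To make the two loss terms mentioned in the abstract more concrete, the following is a minimal PyTorch sketch of an instance-level contrastive loss between two augmentation views and a cluster-level contrastive loss driven by clustering pseudo-labels. The function names, the temperature value, and the plain InfoNCE formulation are illustrative assumptions; the paper's asymmetric siamese design and cluster refinement step are not reproduced here.

    # Hypothetical sketch (not the authors' released code): instance- and
    # cluster-level contrastive losses computed from two augmentation views,
    # with pseudo-labels from clustering guiding the cluster-level term.
    import torch
    import torch.nn.functional as F

    def instance_contrastive_loss(z1, z2, tau=0.1):
        # z1, z2: (N, D) L2-normalized embeddings of the same N images
        # under two different augmentations; matching rows are positives.
        logits = z1 @ z2.t() / tau                      # (N, N) cross-view similarities
        targets = torch.arange(z1.size(0), device=z1.device)
        return F.cross_entropy(logits, targets)

    def cluster_contrastive_loss(z, centroids, pseudo_labels, tau=0.1):
        # Pull each embedding toward its pseudo-cluster centroid and push it
        # away from the other centroids (cluster-level contrast).
        logits = z @ centroids.t() / tau                # (N, K) similarities to centroids
        return F.cross_entropy(logits, pseudo_labels)

    # Usage with random tensors standing in for backbone features:
    N, D, K = 8, 128, 4
    z1 = F.normalize(torch.randn(N, D), dim=1)
    z2 = F.normalize(torch.randn(N, D), dim=1)
    pseudo_labels = torch.randint(0, K, (N,))
    centroids = F.normalize(torch.randn(K, D), dim=1)
    loss = instance_contrastive_loss(z1, z2) + cluster_contrastive_loss(z1, centroids, pseudo_labels)

In this simplified form, the instance-level term contrasts embeddings across augmentation views while the cluster-level term contrasts embeddings against cluster centroids; how the two branches of the siamese network are treated asymmetrically is specific to the paper and omitted above.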
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2022.3173163