3D-GAT: 3D-guided adversarial transform network for person re-identification in unseen domains

Bibliographic Details
Published in: Pattern Recognition 2021-04, Vol.112, p.107799, Article 107799
Main authors: Zhang, Hengheng; Li, Ying; Zhuang, Zijie; Xie, Lingxi; Tian, Qi
Format: Article
Language: English
Online access: Full text
Description
Abstract: Person re-identification (ReID) has witnessed remarkable improvements in the past few years. However, its applications in real-world scenarios are limited by the disparity among different cameras and datasets. In general, it remains challenging to generalize ReID algorithms from one domain to another, especially when the target domain is unknown. To address this issue, we develop a 3D-guided adversarial transform (3D-GAT) network that explores the transferability of source training data to facilitate learning domain-independent knowledge. Guided by a 3D model and human poses, 3D-GAT uses image-to-image translation to synthesize person images under different conditions while preserving identity-relevant features as much as possible. With these augmented training data, ReID approaches can more easily perceive how a person appears under varying viewpoints and poses, most of which are not seen in the training data, and thus achieve higher ReID accuracy, especially in an unknown domain. Extensive experiments conducted on Market-1501, DukeMTMC-reID and CUHK03 demonstrate the effectiveness of the proposed approach, which is competitive with the baseline models on the original dataset and sets a new state of the art in direct transfer to other datasets.
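To make the augmentation idea in the abstract concrete, below is a minimal, hypothetical PyTorch sketch of a pose-conditioned image-to-image generator used to expand a ReID training batch while keeping identity labels fixed. All names (PoseGuidedGenerator, augment_batch), shapes, and layer choices are illustrative assumptions, not the authors' actual 3D-GAT implementation.

import torch
import torch.nn as nn

class PoseGuidedGenerator(nn.Module):
    # Toy image-to-image translator conditioned on a target pose heatmap.
    # Stands in (very roughly) for a 3D/pose-guided transform network.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3 + 1, 32, kernel_size=3, padding=1),  # RGB + 1 pose channel
            nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),
            nn.Tanh(),  # outputs in [-1, 1], matching normalized images
        )

    def forward(self, image, pose_map):
        # Condition the translation on the target pose via channel concatenation.
        return self.net(torch.cat([image, pose_map], dim=1))

def augment_batch(generator, images, pose_maps, labels):
    # Synthesize each person under a new pose/viewpoint; identity labels
    # are duplicated unchanged, so a ReID loss treats real and synthetic
    # views of the same person as positives.
    with torch.no_grad():
        synthetic = generator(images, pose_maps)
    return torch.cat([images, synthetic], dim=0), torch.cat([labels, labels], dim=0)

# Usage: a batch of 8 images (3x128x64) with sampled target-pose heatmaps.
gen = PoseGuidedGenerator()
imgs = torch.randn(8, 3, 128, 64)
poses = torch.rand(8, 1, 128, 64)
ids = torch.arange(8)
aug_imgs, aug_ids = augment_batch(gen, imgs, poses, ids)  # 16 images, 16 labels

The design point this illustrates is the one the abstract makes: augmentation synthesizes unseen viewpoints and poses while the identity label is preserved, so the ReID model learns appearance variation that the source domain alone does not cover.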
ISSN: 0031-3203
eISSN: 1873-5142
DOI: 10.1016/j.patcog.2020.107799