RGB-IR cross-modality person ReID based on teacher-student GAN model


Bibliographic details
Published in: Pattern Recognition Letters 2021-10, Vol. 150, p. 155-161
Authors: Zhang, Ziyue; Jiang, Shuai; Huang, Congzhentao; Li, Yang; Xu, Richard Yi Da
Format: Article
Language: English
Online access: Full text
Description
Abstract:
• Teacher-Student model to minimize the gap between the two modalities.
• Joint cycle-consistency GAN to generate corresponding RGB-IR image pairs.
• Only the main backbone is used at the test stage, for efficiency.
• Extensive experiments demonstrate the effectiveness of the proposed model.
RGB-Infrared (RGB-IR) person re-identification (ReID) is a technology whereby a system can automatically identify the same person appearing in different parts of a video when visible light is unavailable. The critical challenge of this task is the cross-modality gap between features from the two modalities. To address this challenge, we propose a Teacher-Student GAN model (TS-GAN) to adapt to the different domains and guide the ReID backbone. (1) To obtain corresponding RGB-IR image pairs, an RGB-IR Generative Adversarial Network (GAN) is used to generate IR images. (2) To kick-start the training of identities, a ReID Teacher module is trained on IR-modality person images and then used to guide its Student counterpart during training. (3) Likewise, to better adapt the features of the different domains and enhance ReID performance, three Teacher-Student loss functions are used. Unlike other GAN-based models, the proposed model needs only the backbone module at the test stage, making it more efficient and resource-saving. To showcase the model's capability, we conducted extensive experiments on the newly released SYSU-MM01 and RegDB RGB-IR ReID benchmarks and achieved performance superior to the state of the art, with 47.4% mAP and 69.4% mAP respectively.
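
The abstract describes the training scheme only at a high level. As a rough illustration of how such a Teacher-Student distillation step could be wired up, the PyTorch sketch below pairs a frozen Teacher backbone (assumed to be pre-trained on IR images) with a Student backbone guided by three losses. This is a minimal sketch under stated assumptions, not the paper's implementation: the ResNet-50 backbone, the particular three losses (identity cross-entropy, feature-level MSE, softened-logit KL), the equal loss weights, and the names ReIDBackbone, teacher_student_losses, train_step, and rgb_to_ir_gan are all hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models


class ReIDBackbone(nn.Module):
    """ResNet-50 feature extractor with an identity classifier (a common ReID choice; assumed here)."""
    def __init__(self, num_ids: int, feat_dim: int = 2048):
        super().__init__()
        resnet = models.resnet50(weights=None)
        # Drop the final fc layer; keep conv stages + global average pooling.
        self.encoder = nn.Sequential(*list(resnet.children())[:-1])
        self.classifier = nn.Linear(feat_dim, num_ids)

    def forward(self, x):
        feat = self.encoder(x).flatten(1)   # (B, 2048) pooled features
        logits = self.classifier(feat)      # (B, num_ids) identity logits
        return feat, logits


def teacher_student_losses(s_feat, s_logits, t_feat, t_logits, labels, temperature=4.0):
    """Three hypothetical Teacher-Student losses: identity CE, feature-level MSE, softened-logit KL."""
    ce = F.cross_entropy(s_logits, labels)
    feat_mse = F.mse_loss(s_feat, t_feat.detach())
    kl = F.kl_div(
        F.log_softmax(s_logits / temperature, dim=1),
        F.softmax(t_logits.detach() / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    return ce, feat_mse, kl


def train_step(student, teacher, rgb_to_ir_gan, rgb_images, labels, optimizer):
    """One training step: the frozen Teacher sees GAN-generated IR images and guides the Student."""
    teacher.eval()
    with torch.no_grad():
        fake_ir = rgb_to_ir_gan(rgb_images)        # RGB -> IR translation (GAN trained separately)
        t_feat, t_logits = teacher(fake_ir)

    s_feat, s_logits = student(rgb_images)
    ce, feat_mse, kl = teacher_student_losses(s_feat, s_logits, t_feat, t_logits, labels)
    loss = ce + feat_mse + kl                      # equal weights: an arbitrary choice for this sketch

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


# Example setup (data loading and the GAN's own cycle-consistency training are omitted):
# student = ReIDBackbone(num_ids=500)   # num_ids = number of training identities in the dataset
# teacher = ReIDBackbone(num_ids=500)   # assumed pre-trained on IR images, then frozen
# optimizer = torch.optim.Adam(student.parameters(), lr=3e-4)

At test time only the Student backbone would be kept and the GAN and Teacher discarded, which is consistent with the abstract's point that only the backbone module is needed at the test stage.
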
ISSN: 0167-8655, 1872-7344
DOI: 10.1016/j.patrec.2021.07.006