Active Discriminative Cross-Domain Alignment for Low-Resolution Face Recognition


Bibliographic Details
Published in: IEEE Access, 2020, Vol. 8, pp. 97503-97515
Authors: Zheng, Dongdong; Zhang, Kaibing; Lu, Jian; Jing, Junfeng; Xiong, Zenggang
Format: Article
Language: English
Online access: Full text
Description
Abstract: In real-world application scenarios, face images captured by cameras often suffer from blur, illumination variation, occlusion, and low resolution (LR). This poses a challenging problem for many real-time face recognition systems because of the large distribution gap between the captured degraded images and the high-resolution (HR) gallery images. Motivated by the widespread application of transfer learning in cross-domain visual recognition, we propose a novel active discriminative cross-domain alignment (ADCDA) technique for LR face recognition that jointly exploits the geometrical and statistical properties of the source domain and the target domain in a unified way. Specifically, the proposed ADCDA-based method contains three key components: 1) it simultaneously reduces the domain shift in both the marginal and conditional distributions between the source and target domains; 2) it aligns the data of the two domains in a common latent subspace by discriminant locality alignment (DLA); and 3) it selects representative and diverse samples with an active learning strategy to further improve classification performance. Extensive experiments on six benchmark databases verify that the proposed method significantly outperforms state-of-the-art predecessors.
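The abstract's first component, reducing the domain shift in both the marginal and the conditional distributions, is typically quantified with a discrepancy measure such as Maximum Mean Discrepancy (MMD). The sketch below is an illustrative approximation only, not the authors' implementation: the linear kernel, the use of pseudo-labels for the target domain, and all function names are assumptions.

```python
import numpy as np

def mmd_linear(Xs, Xt):
    """Squared MMD with a linear kernel: ||mean(Xs) - mean(Xt)||^2."""
    delta = Xs.mean(axis=0) - Xt.mean(axis=0)
    return float(delta @ delta)

def joint_distribution_shift(Xs, ys, Xt, yt_pseudo):
    """Marginal MMD plus per-class (conditional) MMD terms.

    ys are source labels; yt_pseudo are pseudo-labels predicted for the
    unlabeled target samples (a common device in domain adaptation).
    """
    shift = mmd_linear(Xs, Xt)  # marginal distribution term
    for c in np.unique(ys):
        Xs_c = Xs[ys == c]
        Xt_c = Xt[yt_pseudo == c]
        if len(Xs_c) and len(Xt_c):  # skip classes absent in either domain
            shift += mmd_linear(Xs_c, Xt_c)  # conditional term for class c
    return shift
```

A projection that minimizes such a joint discrepancy (subject to variance or discriminative constraints) draws the two domains together in the shared subspace; the DLA alignment and active sample selection described in the abstract would operate on top of this.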
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.2996796