CATFace: Cross-Attribute-Guided Transformer with Self-Attention Distillation for Low-Quality Face Recognition
Saved in:
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Although face recognition (FR) has achieved great success in recent years, it is still challenging to accurately recognize faces in low-quality images due to obscured facial details. Nevertheless, it is often feasible to predict specific soft biometric (SB) attributes, such as gender and baldness, even when dealing with low-quality images. In this paper, we propose a novel multi-branch neural network that leverages SB attribute information to boost the performance of FR. To this end, we propose a cross-attribute-guided transformer fusion (CATF) module that effectively captures the long-range dependencies and relationships between FR and SB feature representations. The synergy created by the reciprocal flow of information in the dual cross-attention operations of the proposed CATF module enhances the performance of FR. Furthermore, we introduce a novel self-attention distillation framework that effectively highlights crucial facial regions, such as landmarks, by aligning low-quality images with their high-quality counterparts in the feature space. The proposed self-attention distillation regularizes our network to learn a unified quality-invariant feature representation in unconstrained environments. We conduct extensive experiments on various FR benchmarks of varying image quality. Experimental results demonstrate the superiority of our FR method compared to state-of-the-art FR studies.
DOI: 10.48550/arxiv.2401.03037
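
The abstract describes a CATF module in which FR and SB features exchange information through dual cross-attention. Below is a minimal sketch of that idea in PyTorch. The module name, dimensions, and the residual/normalization choices are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of a dual cross-attention fusion block in the spirit of the CATF
# module described in the abstract. All names and shapes are assumptions.
import torch
import torch.nn as nn

class DualCrossAttentionFusion(nn.Module):
    def __init__(self, dim: int = 512, num_heads: int = 8):
        super().__init__()
        # Reciprocal flow: FR tokens attend to SB tokens, and vice versa.
        self.fr_to_sb = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.sb_to_fr = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm_fr = nn.LayerNorm(dim)
        self.norm_sb = nn.LayerNorm(dim)

    def forward(self, fr_tokens, sb_tokens):
        # Query with FR features, key/value with SB features.
        fr_fused, _ = self.fr_to_sb(fr_tokens, sb_tokens, sb_tokens)
        # Query with SB features, key/value with FR features.
        sb_fused, _ = self.sb_to_fr(sb_tokens, fr_tokens, fr_tokens)
        # Residual connections preserve the original representations.
        return (self.norm_fr(fr_tokens + fr_fused),
                self.norm_sb(sb_tokens + sb_fused))

# Usage with random features: batch of 4, 49 spatial tokens, dim 512.
fr = torch.randn(4, 49, 512)
sb = torch.randn(4, 49, 512)
fused_fr, fused_sb = DualCrossAttentionFusion()(fr, sb)
print(fused_fr.shape, fused_sb.shape)  # torch.Size([4, 49, 512]) twice
```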
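The abstract also mentions a self-attention distillation framework that aligns low-quality images with their high-quality counterparts in feature space. The sketch below shows one plausible reading of that idea: a low-quality branch's self-attention map is pushed toward a detached high-quality one. The map construction and MSE loss form are assumptions for illustration, not the paper's exact loss.

```python
# Sketch of a self-attention distillation loss: attention maps from a
# low-quality (LQ) branch are aligned with those of the high-quality (HQ)
# counterpart. Details here are illustrative assumptions.
import torch
import torch.nn.functional as F

def self_attention_map(features: torch.Tensor) -> torch.Tensor:
    """Normalized self-attention map from tokens of shape (batch, tokens, dim)."""
    scores = torch.bmm(features, features.transpose(1, 2))  # (B, T, T)
    return F.softmax(scores / features.shape[-1] ** 0.5, dim=-1)

def distillation_loss(lq_features, hq_features):
    """Align the LQ attention map with the detached HQ map so that crucial
    regions, such as landmarks, transfer across qualities."""
    lq_attn = self_attention_map(lq_features)
    hq_attn = self_attention_map(hq_features).detach()  # HQ acts as teacher
    return F.mse_loss(lq_attn, hq_attn)

lq = torch.randn(4, 49, 512)  # features of a degraded image
hq = torch.randn(4, 49, 512)  # features of its high-quality counterpart
print(distillation_loss(lq, hq).item())
```

Detaching the high-quality map makes it a fixed target, so the gradient only regularizes the low-quality branch toward a quality-invariant representation.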