Dynamic Feature Matching for Partial Face Recognition
| Published in: | IEEE Transactions on Image Processing, 2019-02, Vol. 28 (2), pp. 791-802 |
|---|---|
| Main authors: | , , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Order full text |
Abstract: Partial face recognition (PFR) in an unconstrained environment is an important task, especially in situations where partial face images are likely to be captured due to occlusion, out-of-view regions, and large viewing angles, e.g., video surveillance and mobile devices. However, little attention has been paid to PFR so far, and the problem of recognizing an arbitrary patch of a face image remains largely unsolved. This paper proposes a novel partial face recognition approach, called dynamic feature matching (DFM), which combines fully convolutional networks and sparse representation classification (SRC) to address the partial face recognition problem regardless of face size. DFM does not require prior position information of partial faces relative to a holistic face. By sharing computation, the feature maps are calculated from the entire input image only once, which yields a significant speedup. Experimental results demonstrate the effectiveness and advantages of DFM in comparison with state-of-the-art PFR methods on several partial face databases, including the CASIA-NIR-Distance, CASIA-NIR-Mobile, and LFW databases. The performance of DFM is also impressive in partial person re-identification on the Partial RE-ID and iLIDS databases. The source code of DFM can be found at https://github.com/lingxiao-he/dfmnew.
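The abstract only sketches DFM at a high level. As an illustration of the SRC-style residual classification it builds on, the following is a minimal NumPy sketch: synthetic Gaussian vectors stand in for the FCN feature maps, and a closed-form ridge (collaborative-representation) coder replaces the paper's actual sparse solver. All names, dimensions, and parameters here are hypothetical, not taken from the DFM implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical gallery: one feature vector per template (stand-in for
# pooled FCN activations), 3 templates for each of 5 identities.
n_classes, templates_per_class, dim = 5, 3, 64
gallery = rng.normal(size=(dim, n_classes * templates_per_class))
labels = np.repeat(np.arange(n_classes), templates_per_class)

# Probe: close to class 2 (mean of its templates plus small noise).
probe = gallery[:, labels == 2].mean(axis=1) + 0.05 * rng.normal(size=dim)

# Ridge-regularized coding as a closed-form stand-in for SRC's L1 coding:
# solve min_a ||probe - D a||^2 + lam ||a||^2.
lam = 0.1
D = gallery
a = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ probe)

# Classify by the smallest class-wise reconstruction residual,
# as in sparse-representation classification.
residuals = []
for c in range(n_classes):
    mask = labels == c
    residuals.append(np.linalg.norm(probe - D[:, mask] @ a[mask]))
pred = int(np.argmin(residuals))
print(pred)  # the probe was built from class 2, so 2 is expected
```

In DFM itself, the coding additionally handles variable-size probe feature maps against gallery feature maps; this sketch only shows the residual-based decision rule shared with SRC.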
| ISSN: | 1057-7149, 1941-0042 |
| DOI: | 10.1109/TIP.2018.2870946 |