Recognizing blurred, nonfrontal, illumination, and expression variant partially occluded faces


Bibliographic details
Published in: Journal of the Optical Society of America. A, Optics, image science, and vision, 2016-09, Vol. 33 (9), p. 1887-1900
Authors: Punnappurath, Abhijith; Rajagopalan, Ambasamudram Narayanan
Format: Article
Language: English
Online access: Full text
Description
Abstract: The focus of this paper is on the problem of recognizing faces across space-varying motion blur, changes in pose, illumination, and expression, as well as partial occlusion, when only a single image per subject is available in the gallery. We show how the blur, incurred due to relative motion between the camera and the subject during exposure, can be estimated from the alpha matte of pixels that straddle the boundary between the face and the background. We also devise a strategy to automatically generate the trimap required for matte estimation. Having computed the motion via the matte of the probe, we account for pose variations by synthesizing from the intensity image of the frontal gallery a face image that matches the pose of the probe. To handle illumination, expression variations, and partial occlusion, we model the probe as a linear combination of nine blurred illumination basis images in the synthesized nonfrontal pose, plus a sparse occlusion. We also advocate a recognition metric that capitalizes on the sparsity of the occluded pixels. The performance of our method is extensively validated on synthetic as well as real face data.
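The model described in the abstract (probe ≈ linear combination of nine blurred illumination basis images plus a sparse occlusion, with recognition driven by the sparsity of the occlusion term) can be sketched numerically. The following is a minimal illustration, not the authors' implementation: random matrices stand in for the blurred, pose-matched illumination bases, the threshold `lam` and the alternating least-squares/soft-thresholding loop are assumptions, and the nonzero count only mirrors the paper's occlusion-sparsity metric in spirit.

```python
import numpy as np

def soft_threshold(v, lam):
    """Elementwise soft-thresholding: entries below lam become exactly zero."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def fit_probe(B, y, lam=0.1, iters=50):
    """Fit y ~ B @ c + e with e sparse, alternating a least-squares
    solve for the illumination coefficients c and a soft-threshold
    update for the occlusion term e."""
    e = np.zeros_like(y)
    for _ in range(iters):
        c, *_ = np.linalg.lstsq(B, y - e, rcond=None)
        e = soft_threshold(y - B @ c, lam)
    return c, e

def recognize(gallery_bases, y):
    """Pick the subject whose basis explains the probe with the
    sparsest occlusion term (fewest nonzero occluded pixels)."""
    sparsity = [np.count_nonzero(fit_probe(B, y)[1]) for B in gallery_bases]
    return int(np.argmin(sparsity))

# Synthetic check: subject 0's basis generates the probe, then a block
# of pixels is occluded; recognition should still pick subject 0.
rng = np.random.default_rng(0)
B0, B1 = rng.standard_normal((200, 9)), rng.standard_normal((200, 9))
y = B0 @ rng.standard_normal(9)
y[:20] += 5.0  # simulated occlusion on the first 20 pixels
assert recognize([B0, B1], y) == 0
```

For the correct subject the occlusion term absorbs only the occluded pixels, while for a wrong subject the residual is dense, so the nonzero count separates the two cleanly even under heavy occlusion.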
ISSN: 1084-7529, 1520-8532
DOI: 10.1364/JOSAA.33.001887