Face alignment with cascaded semi-parametric deep greedy neural forests

• A semi-parametric cascaded regression framework for face alignment.
• Hybrid model between neural networks and random forests.
• New training procedure for regression and evaluation.
• Greedy neural forests combine high model expressivity and very fast evaluation.
• Application to small-, medium-, and large-head-pose alignment.


Bibliographic details
Published in: Pattern Recognition Letters 2018-01, Vol. 102, p. 75-81
Main authors: Dapogny, Arnaud; Bailly, Kévin
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Summary: Face alignment is an active topic in computer vision, which consists in aligning a shape model to the face. To this end, most modern approaches refine the shape in a cascaded manner, starting from an initial guess. These shape updates can be applied either in the feature-point space (i.e., explicit updates) or in a low-dimensional, parametric space. In this paper, we propose a semi-parametric cascade that first aligns a parametric shape, then captures more fine-grained deformations of an explicit shape. To learn the shape updates at each cascade stage, we introduce a deep greedy neural forest (GNF) model, an improved version of the deep neural forest (NF). GNF is an ideal regressor for face alignment, as it combines differentiability, high expressivity, and a fast evaluation runtime. The proposed framework is very fast and achieves high accuracy on multiple challenging benchmarks, including small-, medium-, and large-pose experiments.
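To make the semi-parametric cascade concrete, below is a minimal NumPy sketch of the refinement loop the abstract describes: parametric stages update low-dimensional shape parameters q (with shape = mean_shape + P @ q, e.g. from a PCA shape basis), after which explicit stages apply per-landmark displacements. All names here (align, extract_features, LinearRegressor) are illustrative assumptions, not the authors' code, and a plain linear regressor stands in for the paper's greedy neural forest.

```python
import numpy as np

# Illustrative sketch only: a plain linear stage regressor stands in for the
# paper's greedy neural forest (GNF), and the feature extractor is a toy.

class LinearRegressor:
    """Placeholder per-stage regressor mapping features to an update vector."""
    def __init__(self, W, b):
        self.W, self.b = W, b

    def predict(self, phi):
        return self.W @ phi + self.b

def extract_features(image, shape):
    # Toy shape-indexed features: grayscale intensities sampled at the
    # (rounded, clipped) landmark positions. Real systems use richer
    # local descriptors around each landmark.
    h, w = image.shape[:2]
    pts = shape.reshape(-1, 2)
    xs = np.clip(pts[:, 0].round().astype(int), 0, w - 1)
    ys = np.clip(pts[:, 1].round().astype(int), 0, h - 1)
    return image[ys, xs].astype(float)

def align(image, mean_shape, P, parametric_stages, explicit_stages):
    """Semi-parametric cascade: refine shape parameters q first
    (shape = mean_shape + P @ q), then apply explicit per-landmark
    updates to capture fine-grained deformations."""
    q = np.zeros(P.shape[1])                  # initial guess: mean shape
    shape = mean_shape + P @ q
    for reg in parametric_stages:             # updates in parametric space
        phi = extract_features(image, shape)
        q = q + reg.predict(phi)
        shape = mean_shape + P @ q
    for reg in explicit_stages:               # updates in feature-point space
        phi = extract_features(image, shape)
        shape = shape + reg.predict(phi)
    return shape

# Tiny smoke test with random data (68 landmarks, 5 shape modes).
rng = np.random.default_rng(0)
image = rng.random((128, 128))
mean_shape = rng.uniform(20, 100, size=136)   # 68 (x, y) pairs, flattened
P = rng.normal(size=(136, 5))
n_feat = 68                                   # one intensity per landmark
parametric = [LinearRegressor(rng.normal(scale=0.01, size=(5, n_feat)),
                              np.zeros(5)) for _ in range(2)]
explicit = [LinearRegressor(rng.normal(scale=0.01, size=(136, n_feat)),
                            np.zeros(136)) for _ in range(2)]
print(align(image, mean_shape, P, parametric, explicit).shape)  # (136,)
```

In a trained system the stage regressors would be fit sequentially, each on the residual error left by the previous stage; here they are random placeholders that only demonstrate the control flow.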
ISSN: 0167-8655
eISSN: 1872-7344
DOI: 10.1016/j.patrec.2017.12.010