Face super-resolution via nonlinear adaptive representation
Saved in:
Published in: Neural computing & applications, 2020-08, Vol. 32 (15), pp. 11637-11649
Main authors: , , , ,
Format: Article
Language: eng
Subjects:
Online access: Full text
Abstract: Face super-resolution is a super-resolution technique that takes one or more observed low-resolution images and converts them to a high-resolution image. Learning-based face super-resolution relies on prior information from a training database. Most patch-based face super-resolution methods assume homoscedasticity of the reconstruction error in the objective function and solve it with regularized least squares. In fact, heteroscedasticity generally exists both in the observed data and in the reconstruction error. To access accurate prior information, we propose a nonlinear adaptive representation (NAR) scheme for hallucinating the individuality of facial images. First, we apply a weighted regularization process to both the reconstruction error and representation coefficient terms to eliminate the heteroscedasticity of the input data. Then, contextual patches and residual high-frequency components are explored to enrich the prior information. Moreover, a nonlinear extension of the adaptive representation fully utilizes the accurate prior information to achieve better reconstruction performance. Experiments on the CAS-PEAL-R1, Webface and LDHF databases show that NAR outperforms several state-of-the-art face super-resolution methods, including some deep learning-based approaches.
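The weighted regularized least-squares step the abstract describes (down-weighting heteroscedastic pixels in the reconstruction error and adaptively penalizing representation coefficients) can be sketched as follows. The function name, dictionary shapes, and diagonal weighting scheme are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def weighted_regularized_ls(D, y, sample_w, coef_w, lam=0.01):
    """Solve min_w ||W^{1/2}(y - D w)||^2 + lam * ||V^{1/2} w||^2.

    D        : (d, k) dictionary of low-resolution training patches (columns).
    y        : (d,) observed low-resolution patch.
    sample_w : (d,) per-pixel weights W (down-weight noisy, heteroscedastic pixels).
    coef_w   : (k,) per-atom penalty weights V (adaptive regularization).
    Closed form: w = (D^T W D + lam * V)^{-1} D^T W y.
    """
    W = np.diag(sample_w)
    A = D.T @ W @ D + lam * np.diag(coef_w)
    b = D.T @ W @ y
    return np.linalg.solve(A, b)

# Toy usage: the coefficients found on the LR dictionary are reused on the
# corresponding HR dictionary to hallucinate the high-resolution patch.
rng = np.random.default_rng(0)
D_lr = rng.standard_normal((16, 8))   # low-resolution patch dictionary
D_hr = rng.standard_normal((64, 8))   # paired high-resolution dictionary
y = D_lr @ np.ones(8) + 0.01 * rng.standard_normal(16)
w = weighted_regularized_ls(D_lr, y, np.ones(16), np.ones(8), lam=1e-3)
x_hr = D_hr @ w                        # reconstructed high-resolution patch
```

With uniform weights this reduces to ordinary ridge-regularized least squares; the point of the adaptive weighting is that unequal `sample_w` and `coef_w` let each patch's noise level shape its own solution.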
ISSN: 0941-0643, 1433-3058
DOI: 10.1007/s00521-019-04652-5