Super-resolution for face image with an improved K-NN search strategy
Published in: China Communications, 2016-04, Vol. 13 (4), p. 151-161
Format: Article
Language: English
Online access: Order full text
Abstract: Recently, neighbor embedding based face super-resolution (SR) methods have shown the ability to produce high-quality face images. These methods rest on the assumption that the same neighborhoods are preserved in both the low-resolution (LR) training set and the high-resolution (HR) training set. However, because of the "one-to-many" mapping between an LR image and HR images in practice, the neighborhood relationship of an LR patch in LR space differs considerably from that of its HR counterpart; that is, the neighborhood relationship obtained in LR space is unreliable. In this paper, we explore a novel and effective re-identified K-nearest neighbor (RIKNN) method to search for the neighbors of an LR patch. In contrast to other methods, ours uses the geometrical information of the LR manifold and the HR manifold simultaneously: it searches for the K-NN of an LR patch in LR space and refines the results by re-identifying them in HR space, yielding more accurate K-NN and improved performance. A statistical analysis of the influence of training set size and nearest neighbor number is given. Experimental results on several public face databases show the superiority of the proposed scheme over state-of-the-art face hallucination approaches in terms of subjective and objective quality as well as computational complexity.
ISSN: 1673-5447
DOI: 10.1109/CC.2016.7464132
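
For readers who want a concrete picture of the search strategy the abstract describes, below is a minimal Python sketch of a re-identified K-NN search followed by standard LLE-style neighbor embedding. The abstract does not specify the re-identification criterion, so the rule used here (keep the LR candidates whose HR counterparts are most mutually consistent in HR space) is an assumption, and all function names and parameters are illustrative rather than the authors' implementation.

```python
# Hedged sketch of a re-identified K-NN (RIKNN) search as described in the
# abstract. ASSUMPTION: "re-identifying in the HR space" is read here as
# taking an enlarged candidate set in LR space, then keeping the K candidates
# whose HR counterparts lie closest to the centroid of the candidate HR
# patches. The exact rule in the paper may differ.
import numpy as np

def riknn_search(lr_patch, lr_train, hr_train, k=5, k_candidates=15):
    """Search K-NN of an LR patch, refined by re-identification in HR space.

    lr_patch     : (d_lr,)   query LR patch (vectorized)
    lr_train     : (N, d_lr) LR training patches
    hr_train     : (N, d_hr) corresponding HR training patches
    k            : number of neighbors to return
    k_candidates : size of the initial LR candidate set (assumed > k)
    """
    # Step 1: K-NN in LR space (the standard neighbor-embedding step).
    lr_dist = np.linalg.norm(lr_train - lr_patch, axis=1)
    cand = np.argsort(lr_dist)[:k_candidates]

    # Step 2 (assumed re-identification rule): rank the candidates by how
    # consistent their HR counterparts are with each other in HR space,
    # and keep the k most consistent ones.
    hr_cand = hr_train[cand]
    hr_centroid = hr_cand.mean(axis=0)
    hr_dist = np.linalg.norm(hr_cand - hr_centroid, axis=1)
    return cand[np.argsort(hr_dist)[:k]]

def neighbor_embedding_weights(lr_patch, neighbors_lr, reg=1e-6):
    """LLE-style reconstruction weights (sum to one), as used in classic
    neighbor-embedding SR; the same weights are applied to the HR patches."""
    Z = neighbors_lr - lr_patch                  # shift neighbors to the query
    G = Z @ Z.T                                  # local Gram matrix
    G += reg * np.trace(G) * np.eye(len(G))      # regularize for stability
    w = np.linalg.solve(G, np.ones(len(G)))
    return w / w.sum()

# Toy usage with random data (shapes only; real patches come from face images).
rng = np.random.default_rng(0)
lr_train = rng.normal(size=(200, 25))    # e.g. vectorized 5x5 LR patches
hr_train = rng.normal(size=(200, 100))   # e.g. vectorized 10x10 HR patches
query = rng.normal(size=25)
idx = riknn_search(query, lr_train, hr_train)
w = neighbor_embedding_weights(query, lr_train[idx])
hr_estimate = w @ hr_train[idx]          # hallucinated HR patch
```

The enlarged candidate set is what lets the HR-space check do useful work: candidates that are close in LR space but whose HR counterparts are outliers get filtered out, which is the failure mode the "one-to-many" mapping creates for plain LR-space K-NN.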