ENTED: Enhanced Neural Texture Extraction and Distribution for Reference-based Blind Face Restoration
Format: Article
Language: English
Online access: order full text
Abstract: We present ENTED, a new framework for blind face restoration that aims to restore high-quality and realistic portrait images. Our method repairs a single degraded input image using a high-quality reference image. We use a texture extraction and distribution framework to transfer high-quality texture features between the degraded input and the reference image. However, the StyleGAN-like architecture in our framework requires high-quality latent codes to generate realistic images. The latent code extracted from the degraded input image often contains corrupted features, making it difficult to align the semantic information from the input with the high-quality textures from the reference. To overcome this challenge, we employ two special techniques. The first technique, inspired by vector quantization, replaces corrupted semantic features with high-quality code words. The second technique generates style codes that carry photorealistic texture information from a more informative latent space built from the high-quality features in the reference image's manifold. Extensive experiments on synthetic and real-world datasets demonstrate that our method produces results with more realistic contextual details and outperforms state-of-the-art methods. A thorough ablation study confirms the effectiveness of each proposed module.
DOI: 10.48550/arxiv.2401.06978
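The vector-quantization idea mentioned in the abstract — replacing corrupted feature vectors with their nearest high-quality code words — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the codebook here is random, whereas in ENTED it would be learned from clean face features, and the feature shapes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical codebook: 512 "high-quality" code words, 64-dim each.
codebook = rng.normal(size=(512, 64))

def quantize(features: np.ndarray) -> np.ndarray:
    """Replace each feature vector (row) with its nearest code word (L2)."""
    # Pairwise squared distances: features (N, 64) vs. codebook (512, 64).
    d2 = ((features[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    idx = d2.argmin(axis=1)  # index of the nearest code word per feature
    return codebook[idx]

# Stand-in for (possibly corrupted) semantic features from a degraded input.
feats = rng.normal(size=(10, 64))
clean = quantize(feats)
print(clean.shape)  # (10, 64)
```

Every output row is drawn from the codebook, so downstream layers only ever see "clean" feature vectors, which is the intuition behind using quantization to suppress degradation artifacts.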