Novel View Synthesis for High-fidelity Headshot Scenes
Format: Article
Language: English
Abstract: Rendering scenes with a high-quality human face from arbitrary viewpoints is a practical and useful technique for many real-world applications. Recently, Neural Radiance Fields (NeRF), a rendering technique that uses neural networks to approximate classical ray tracing, has been considered a promising approach for synthesizing novel views from a sparse set of images. We find that NeRF can render new views while maintaining geometric consistency, but it does not properly preserve skin details such as moles and pores. These details are particularly important for faces, because we are far more sensitive to detail when looking at an image of a face than when looking at other objects. On the other hand, 3D Morphable Models (3DMMs) based on traditional meshes and textures perform well in terms of skin detail, even though they have less precise geometry and cannot cover the entire head or the scene background. Based on these observations, we propose a method that uses both NeRF and a 3DMM to synthesize a high-fidelity novel view of a scene containing a face. Our method trains a Generative Adversarial Network (GAN) to mix a NeRF-synthesized image with a 3DMM-rendered image, producing a photorealistic scene with a face that preserves skin details. Experiments on various real-world scenes demonstrate the effectiveness of our approach. The code will be available at https://github.com/showlab/headshot.
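To illustrate the idea described in the abstract, the sketch below shows one minimal way a generator could blend a NeRF-rendered view with a 3DMM-rendered face: it predicts a per-pixel mask that selects between the two renders plus a small residual correction. This is an assumption-laden illustration, not the paper's released code; the class name `BlendGenerator`, the architecture, and the mask-plus-residual formulation are hypothetical, and the adversarial discriminator and training losses are omitted. The authors' actual implementation is at the GitHub URL above.

```python
# Hypothetical sketch of a NeRF/3DMM blending generator (not the paper's code).
import torch
import torch.nn as nn

class BlendGenerator(nn.Module):
    def __init__(self, channels=32):
        super().__init__()
        # 6 input channels: NeRF RGB concatenated with 3DMM RGB.
        self.encoder = nn.Sequential(
            nn.Conv2d(6, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.to_rgb = nn.Conv2d(channels, 3, 3, padding=1)   # residual correction
        self.to_mask = nn.Conv2d(channels, 1, 3, padding=1)  # per-pixel blend mask

    def forward(self, nerf_rgb, dmm_rgb):
        x = self.encoder(torch.cat([nerf_rgb, dmm_rgb], dim=1))
        mask = torch.sigmoid(self.to_mask(x))      # where to trust the detailed 3DMM render
        residual = torch.tanh(self.to_rgb(x))      # small correction on top of the blend
        blended = mask * dmm_rgb + (1.0 - mask) * nerf_rgb
        return torch.clamp(blended + residual, 0.0, 1.0)

# Usage with dummy inputs: images as (N, 3, H, W) tensors in [0, 1].
gen = BlendGenerator()
nerf_view = torch.rand(1, 3, 256, 256)
dmm_view = torch.rand(1, 3, 256, 256)
out = gen(nerf_view, dmm_view)  # blended image; trained adversarially in the paper
```

In such a setup, the mask lets the network take fine skin detail (moles, pores) from the 3DMM render around the face while keeping the geometrically consistent NeRF output for the head and background, matching the motivation given in the abstract.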
DOI: 10.48550/arxiv.2205.15595