NeX: Real-time View Synthesis with Neural Basis Expansion
Format: Article
Language: English
Abstract: We present NeX, a new approach to novel view synthesis based on enhancements of the multiplane image (MPI) that can reproduce next-level view-dependent effects -- in real time. Unlike a traditional MPI, which uses a set of simple RGB$\alpha$ planes, our technique models view-dependent effects by instead parameterizing each pixel as a linear combination of basis functions learned from a neural network. Moreover, we propose a hybrid implicit-explicit modeling strategy that improves fine detail and produces state-of-the-art results. Our method is evaluated on benchmark forward-facing datasets as well as our newly introduced dataset designed to test the limits of view-dependent modeling with significantly more challenging effects, such as rainbow reflections on a CD. Our method achieves the best overall scores across all major metrics on these datasets, with more than 1000$\times$ faster rendering time than the state of the art. For real-time demos, visit https://nex-mpi.github.io/
DOI: 10.48550/arxiv.2103.05606