GauFRe: Gaussian Deformation Fields for Real-time Dynamic Novel View Synthesis
Saved in:

Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Summary: We propose a method that achieves state-of-the-art rendering quality and
efficiency on monocular dynamic scene reconstruction using deformable 3D
Gaussians. Implicit deformable representations commonly model motion with a
canonical space and time-dependent backward-warping deformation field. Our
method, GauFRe, uses a forward-warping deformation to explicitly model
non-rigid transformations of scene geometry. Specifically, we propose a
template set of 3D Gaussians residing in a canonical space, and a
time-dependent forward-warping deformation field to model dynamic objects.
Additionally, we tailor a 3D Gaussian-specific static component supported by an
inductive bias-aware initialization approach which allows the deformation field
to focus on moving scene regions, improving the rendering of complex real-world
motion. The differentiable pipeline is optimized end-to-end with a
self-supervised rendering loss. Experiments show our method achieves quality
competitive with, and efficiency higher than, previous state-of-the-art
NeRF- and Gaussian-based methods. For real-world scenes, GauFRe trains in ~20
minutes and renders in real time at 96 FPS on an RTX 3090 GPU. Project website:
https://lynl7130.github.io/gaufre/index.html
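To make the forward-warping design described in the summary concrete, below is a minimal PyTorch sketch, assuming an MLP deformation field over canonical Gaussian parameters. All names (`DeformationField`, `mu_dyn`, `mu_stat`) and the network shape are illustrative assumptions, not the authors' released code; the actual method also handles rotations, scales, and a full differentiable rasterizer.

```python
# Hypothetical sketch of forward-warping canonical 3D Gaussians with a
# time-conditioned deformation field, as described in the summary above.
import torch
import torch.nn as nn

class DeformationField(nn.Module):
    """Time-conditioned MLP that forward-warps canonical Gaussian parameters."""
    def __init__(self, hidden: int = 256):
        super().__init__()
        # Input: 3D canonical mean + scalar time; output: offsets for the
        # mean (3), rotation quaternion (4), and log-scale (3) per Gaussian.
        self.mlp = nn.Sequential(
            nn.Linear(3 + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3 + 4 + 3),
        )

    def forward(self, mu: torch.Tensor, t: torch.Tensor):
        # mu: (N, 3) canonical means; t: scalar time broadcast to all Gaussians.
        t = t.expand(mu.shape[0], 1)
        out = self.mlp(torch.cat([mu, t], dim=-1))
        d_mu, d_rot, d_scale = out.split([3, 4, 3], dim=-1)
        return d_mu, d_rot, d_scale

# Canonical (template) Gaussians: a dynamic set that gets deformed and a
# static set that does not (counts are arbitrary placeholders).
N_dyn, N_stat = 10_000, 5_000
mu_dyn = torch.randn(N_dyn, 3, requires_grad=True)
mu_stat = torch.randn(N_stat, 3, requires_grad=True)

field = DeformationField()
t = torch.tensor(0.5)  # normalized frame time in [0, 1]
d_mu, d_rot, d_scale = field(mu_dyn, t)

# Forward warp: only dynamic Gaussians move; static ones pass through, so
# the field can spend its capacity on moving scene regions.
mu_t = torch.cat([mu_dyn + d_mu, mu_stat], dim=0)
# mu_t (plus warped rotations/scales) would then be splatted by a
# differentiable Gaussian rasterizer and optimized with a rendering loss.
```

In this sketch the static Gaussians bypass the field entirely, mirroring the summary's point that a dedicated static component lets the deformation field focus on moving regions.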
DOI: 10.48550/arxiv.2312.11458