RainyScape: Unsupervised Rainy Scene Reconstruction using Decoupled Neural Rendering
Saved in:
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: We propose RainyScape, an unsupervised framework for reconstructing clean scenes from a collection of multi-view rainy images. RainyScape consists of two main modules: a neural rendering module and a rain-prediction module that incorporates a predictor network and a learnable latent embedding that captures the rain characteristics of the scene. Specifically, based on the spectral bias property of neural networks, we first optimize the neural rendering pipeline to obtain a low-frequency scene representation. Subsequently, we jointly optimize the two modules, driven by the proposed adaptive direction-sensitive gradient-based reconstruction loss, which encourages the network to distinguish between scene details and rain streaks, facilitating the propagation of gradients to the relevant components. Extensive experiments on both the classic neural radiance field and the recently proposed 3D Gaussian splatting demonstrate the superiority of our method in effectively eliminating rain streaks and rendering clean images, achieving state-of-the-art performance. The constructed high-quality dataset and source code will be publicly available.
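The abstract does not give the exact form of the adaptive direction-sensitive gradient-based reconstruction loss, but the idea it describes (comparing image gradients along several directions and adaptively down-weighting directions dominated by strong, rain-like gradients) can be sketched as follows. This is a hypothetical illustration, not the paper's implementation; the function names, the composition `rendered + rain`, and the exponential weighting scheme are all assumptions.

```python
import torch
import torch.nn.functional as F

def directional_gradients(img):
    """Finite-difference gradients of a (B, C, H, W) image along the
    horizontal, vertical, and two diagonal directions."""
    gh = img[..., :, 1:] - img[..., :, :-1]       # horizontal
    gv = img[..., 1:, :] - img[..., :-1, :]       # vertical
    gd1 = img[..., 1:, 1:] - img[..., :-1, :-1]   # main diagonal
    gd2 = img[..., 1:, :-1] - img[..., :-1, 1:]   # anti-diagonal
    return gh, gv, gd1, gd2

def direction_sensitive_loss(rendered, rain, observed, alpha=1.0):
    """Hypothetical sketch of a direction-sensitive reconstruction loss:
    an L1 term on the composed image (clean scene + predicted rain)
    plus directional gradient terms whose adaptive weights suppress
    directions with large observed gradients (likely rain streaks),
    so clean-scene gradients dominate the supervision."""
    composed = rendered + rain
    loss = F.l1_loss(composed, observed)
    for g_c, g_o in zip(directional_gradients(composed),
                        directional_gradients(observed)):
        # Directions with strong observed gradients contribute less.
        w = torch.exp(-alpha * g_o.abs().detach())
        loss = loss + (w * (g_c - g_o).abs()).mean()
    return loss
```

In an actual pipeline, `rendered` would come from the neural rendering module (NeRF or 3D Gaussian splatting) and `rain` from the rain-prediction module; the weights decouple the gradient flow so scene detail updates the renderer while streak-like structure is absorbed by the rain predictor.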
DOI: 10.48550/arxiv.2404.11401