OptiViewNeRF: Optimizing 3D reconstruction via batch view selection and scene uncertainty in Neural Radiance Fields
Published in: International journal of applied earth observation and geoinformation, 2025-02, Vol. 136, p. 104306, Article 104306
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: In situations with a limited number of posed images, choosing the most suitable viewpoints becomes crucial for accurate Neural Radiance Fields (NeRF) modeling. Current approaches to view selection often rely on heuristics or are computationally intensive. To address these challenges, we introduce a new framework, OptiViewNeRF, which leverages scene uncertainty to guide the view selection process. First, an uncertainty estimation model of the entire scene is built on top of a preliminary NeRF model. This model then informs the selection of new perception viewpoints through a batch view selection strategy, allowing the entire process to be completed in a single iteration. By selecting viewpoints that provide informative data, the approach improves novel view synthesis and reconstructs 3D scenes more accurately. Experimental results on two datasets show that the proposed method effectively identifies informative viewpoints, yielding more accurate scene reconstructions than baseline and state-of-the-art methods.
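The abstract outlines a one-round, uncertainty-guided batch view selection but gives no implementation details, so the following is only a minimal sketch under stated assumptions: it presumes per-candidate uncertainty scores (e.g., produced by the preliminary NeRF) are already available, and it combines them greedily with a naive distance-based diversity bonus to pick a batch in a single pass. The function name, the diversity heuristic, and the inputs are illustrative, not the paper's actual method.

```python
import numpy as np

def select_batch_views(uncertainty, cam_positions, k, diversity_weight=0.5):
    """One-round greedy selection of k candidate viewpoints.

    uncertainty   : (N,) assumed per-view uncertainty scores, e.g. mean
                    rendering variance from a preliminary NeRF (hypothetical).
    cam_positions : (N, 3) candidate camera positions, used only for a
                    naive diversity bonus so the batch is not redundant.
    """
    uncertainty = np.asarray(uncertainty, dtype=float)
    cam_positions = np.asarray(cam_positions, dtype=float)
    selected = []
    for _ in range(k):
        score = uncertainty.copy()
        if selected:
            # Reward candidates far from already-selected views
            # (naive scale mixing; a real criterion would normalize terms).
            dists = np.linalg.norm(
                cam_positions[:, None, :] - cam_positions[selected][None, :, :],
                axis=-1,
            ).min(axis=1)
            score += diversity_weight * dists
        score[selected] = -np.inf  # never re-select a view
        selected.append(int(np.argmax(score)))
    return selected

# Toy usage with random stand-in data (not from the paper's experiments).
rng = np.random.default_rng(0)
unc = rng.random(100)               # stand-in uncertainty per candidate view
cams = rng.normal(size=(100, 3))    # stand-in camera positions
print(select_batch_views(unc, cams, k=5))
```

Because the whole batch is scored against the same preliminary-NeRF uncertainty, no retraining is needed between picks, which is what allows the selection to finish in a single round.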
Highlights:
• A new viewpoint selection framework for optimal reconstruction is proposed.
• Quantifies scene-level uncertainty, avoiding the need to remodel the NeRF.
• Introduces a batch view strategy, completing view selection in a single round.
• Related resources, including datasets and code, are provided.
ISSN: 1569-8432
DOI: 10.1016/j.jag.2024.104306