Pixel-GS: Density Control with Pixel-aware Gradient for 3D Gaussian Splatting

3D Gaussian Splatting (3DGS) has demonstrated impressive novel view synthesis results while advancing real-time rendering performance. However, it relies heavily on the quality of the initial point cloud, resulting in blurring and needle-like artifacts in areas with insufficient initializing points. This is mainly attributed to the point cloud growth condition in 3DGS that only considers the average gradient magnitude of points from observable views, thereby failing to grow for large Gaussians that are observable for many viewpoints while many of them are only covered in the boundaries. To this end, we propose a novel method, named Pixel-GS, to take into account the number of pixels covered by the Gaussian in each view during the computation of the growth condition. We regard the covered pixel numbers as the weights to dynamically average the gradients from different views, such that the growth of large Gaussians can be prompted. As a result, points within the areas with insufficient initializing points can be grown more effectively, leading to a more accurate and detailed reconstruction. In addition, we propose a simple yet effective strategy to scale the gradient field according to the distance to the camera, to suppress the growth of floaters near the camera. Extensive experiments both qualitatively and quantitatively demonstrate that our method achieves state-of-the-art rendering quality while maintaining real-time rendering speed, on the challenging Mip-NeRF 360 and Tanks & Temples datasets.
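The pixel-aware growth condition can be illustrated with a small sketch. The helper below (all names are hypothetical, not from the paper's code) contrasts the uniform per-view average of gradient magnitudes used by 3DGS with a pixel-count-weighted average, for a large Gaussian that most views only clip at the image boundary:

```python
import numpy as np

def pixel_weighted_grad(view_grads, pixel_counts):
    """Weighted mean of per-view positional-gradient magnitudes,
    using the number of pixels the Gaussian covers in each view
    as the weight (hypothetical helper, for illustration only)."""
    g = np.asarray(view_grads, dtype=float)
    w = np.asarray(pixel_counts, dtype=float)
    return float((w * g).sum() / w.sum())

# A large Gaussian: one view covers it broadly (large gradient, many
# pixels); the other views clip it at the image boundary (tiny
# gradient, few pixels).  The uniform 3DGS-style mean is diluted by
# the boundary views; the pixel-weighted mean is not.
grads  = [0.004, 0.0001, 0.0001, 0.0001]   # per-view gradient magnitudes
pixels = [5000,  10,     10,     10]       # pixels covered in each view

uniform  = float(np.mean(grads))               # diluted average
weighted = pixel_weighted_grad(grads, pixels)  # dominated by the full view
```

Here the weighted statistic is nearly four times the uniform one, so such a Gaussian crosses a fixed densification threshold far more readily, which is the effect the abstract describes.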

Full description

Bibliographic details
Published in: arXiv.org 2024-03
Main authors: Zhang, Zheng; Hu, Wenbo; Lao, Yixing; He, Tong; Zhao, Hengshuang
Format: Article
Language: eng
Subjects:
Online access: Full text
container_title arXiv.org
creator Zhang, Zheng
Hu, Wenbo
Lao, Yixing
He, Tong
Zhao, Hengshuang
description 3D Gaussian Splatting (3DGS) has demonstrated impressive novel view synthesis results while advancing real-time rendering performance. However, it relies heavily on the quality of the initial point cloud, resulting in blurring and needle-like artifacts in areas with insufficient initializing points. This is mainly attributed to the point cloud growth condition in 3DGS that only considers the average gradient magnitude of points from observable views, thereby failing to grow for large Gaussians that are observable for many viewpoints while many of them are only covered in the boundaries. To this end, we propose a novel method, named Pixel-GS, to take into account the number of pixels covered by the Gaussian in each view during the computation of the growth condition. We regard the covered pixel numbers as the weights to dynamically average the gradients from different views, such that the growth of large Gaussians can be prompted. As a result, points within the areas with insufficient initializing points can be grown more effectively, leading to a more accurate and detailed reconstruction. In addition, we propose a simple yet effective strategy to scale the gradient field according to the distance to the camera, to suppress the growth of floaters near the camera. Extensive experiments both qualitatively and quantitatively demonstrate that our method achieves state-of-the-art rendering quality while maintaining real-time rendering speed, on the challenging Mip-NeRF 360 and Tanks & Temples datasets.
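The abstract's second component, scaling the gradient field by distance to the camera to suppress near-camera floaters, can also be sketched. The exact scaling function is not given in the abstract, so the linear, scene-extent-normalized ramp below is an assumed form for illustration only:

```python
import numpy as np

def depth_scaled_grad(grad, depth, scene_extent):
    """Damp the densification gradient of points close to the camera.
    The linear ramp clipped at 1.0 is an assumed form for illustration:
    points nearer than `scene_extent` contribute proportionally less,
    so floaters hovering in front of the camera are grown less often,
    while distant geometry keeps its full gradient."""
    return grad * float(np.clip(depth / scene_extent, 0.0, 1.0))

near = depth_scaled_grad(0.004, depth=0.2,  scene_extent=10.0)  # heavily damped
far  = depth_scaled_grad(0.004, depth=25.0, scene_extent=10.0)  # unchanged
```

Any monotone ramp in depth would serve the stated purpose; the key property is that the same raw gradient triggers densification less often for points sitting just in front of the camera.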
format Article
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2024-03
issn 2331-8422
language eng
recordid cdi_proquest_journals_2986555299
source Freely Accessible Journals
subjects Blurring
Cameras
Image reconstruction
Pixels
Real time
Rendering
Temples
title Pixel-GS: Density Control with Pixel-aware Gradient for 3D Gaussian Splatting
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-16T23%3A11%3A51IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=Pixel-GS:%20Density%20Control%20with%20Pixel-aware%20Gradient%20for%203D%20Gaussian%20Splatting&rft.jtitle=arXiv.org&rft.au=Zhang,%20Zheng&rft.date=2024-03-22&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2986555299%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2986555299&rft_id=info:pmid/&rfr_iscdi=true