Next Event Estimation++: Visibility Mapping for Efficient Light Transport Simulation
Monte‐Carlo rendering requires determining the visibility between scene points as the most common and compute intense operation to establish paths between camera and light source. Unfortunately, many tests reveal occlusions and the corresponding paths do not contribute to the final image. In this work, we present next event estimation++ (NEE++): a visibility mapping technique to perform visibility tests in a more informed way by caching voxel to voxel visibility probabilities. We show two scenarios: Russian roulette style rejection of visibility tests and direct importance sampling of the visibility. We show applications to next event estimation and light sampling in a uni‐directional path tracer, and light‐subpath sampling in Bi‐Directional Path Tracing. The technique is simple to implement, easy to add to existing rendering systems, and comes at almost no cost, as the required information can be directly extracted from the rendering process itself. It discards up to 80% of visibility tests on average, while reducing variance by ∼20% compared to other state‐of‐the‐art light sampling techniques with the same number of samples. It gracefully handles complex scenes with efficiency similar to Metropolis light transport techniques but with a more uniform convergence.
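The core idea of the abstract, caching voxel-to-voxel visibility probabilities and using them for Russian-roulette-style rejection of shadow rays, can be sketched in code. This is a minimal illustration only: the class, method names, grid resolution, and the optimistic prior for unseen voxel pairs are all assumptions for the sake of the example, not the paper's actual implementation.

```python
import random

class VisibilityMap:
    """Illustrative voxel-to-voxel visibility cache (not the paper's API).

    Shading and light points are quantized to voxels on a coarse uniform
    grid; for every voxel pair we track how many shadow rays were
    unoccluded vs. traced in total, giving an estimated visibility
    probability for that pair.
    """

    def __init__(self, resolution=8):
        self.res = resolution
        self.visible = {}   # (voxel_a, voxel_b) -> unoccluded count
        self.total = {}     # (voxel_a, voxel_b) -> total count

    def voxel(self, p, scene_min=(0.0, 0.0, 0.0), scene_max=(1.0, 1.0, 1.0)):
        # Quantize a world-space point to integer voxel coordinates.
        return tuple(
            min(self.res - 1,
                int((p[i] - scene_min[i]) / (scene_max[i] - scene_min[i]) * self.res))
            for i in range(3))

    def key(self, a, b):
        # Visibility is symmetric, so store each voxel pair only once.
        return (min(a, b), max(a, b))

    def record(self, pa, pb, unoccluded):
        # Update the running statistics with one shadow-ray result.
        k = self.key(self.voxel(pa), self.voxel(pb))
        self.total[k] = self.total.get(k, 0) + 1
        if unoccluded:
            self.visible[k] = self.visible.get(k, 0) + 1

    def probability(self, pa, pb):
        # Estimated probability that pa sees pb; an optimistic prior of
        # 1.0 ensures unseen voxel pairs are always actually tested.
        k = self.key(self.voxel(pa), self.voxel(pb))
        n = self.total.get(k, 0)
        return 1.0 if n == 0 else self.visible.get(k, 0) / n


def russian_roulette_visibility(vmap, shading_p, light_p, trace_shadow_ray,
                                rng=random.random):
    """Russian-roulette-style rejection of a visibility test.

    With probability q (the cached visibility estimate) the shadow ray is
    actually traced and the result reweighted by 1/q, which keeps the
    estimator unbiased; otherwise the test is skipped and the path is
    treated as occluded.
    """
    q = vmap.probability(shading_p, light_p)
    if q <= 0.0 or rng() >= q:
        return 0.0                      # skipped: assume occluded
    visible = trace_shadow_ray(shading_p, light_p)
    vmap.record(shading_p, light_p, visible)
    return (1.0 / q) if visible else 0.0
```

Because frequently-occluded voxel pairs accumulate low probabilities, most of their shadow rays are skipped, which matches the abstract's claim of discarding a large fraction of visibility tests while keeping the estimate unbiased through the 1/q reweighting.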
Saved in:
Published in: | Computer graphics forum 2020-10, Vol.39 (7), p.205-217 |
---|---|
Main authors: | Guo, Jerry Jinfeng; Eisemann, Martin; Eisemann, Elmar |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 217 |
---|---|
container_issue | 7 |
container_start_page | 205 |
container_title | Computer graphics forum |
container_volume | 39 |
creator | Guo, Jerry Jinfeng Eisemann, Martin Eisemann, Elmar |
description | Monte‐Carlo rendering requires determining the visibility between scene points as the most common and compute intense operation to establish paths between camera and light source. Unfortunately, many tests reveal occlusions and the corresponding paths do not contribute to the final image. In this work, we present next event estimation++ (NEE++): a visibility mapping technique to perform visibility tests in a more informed way by caching voxel to voxel visibility probabilities. We show two scenarios: Russian roulette style rejection of visibility tests and direct importance sampling of the visibility. We show applications to next event estimation and light sampling in a uni‐directional path tracer, and light‐subpath sampling in Bi‐Directional Path Tracing. The technique is simple to implement, easy to add to existing rendering systems, and comes at almost no cost, as the required information can be directly extracted from the rendering process itself. It discards up to 80% of visibility tests on average, while reducing variance by ∼20% compared to other state‐of‐the‐art light sampling techniques with the same number of samples. It gracefully handles complex scenes with efficiency similar to Metropolis light transport techniques but with a more uniform convergence. |
doi_str_mv | 10.1111/cgf.14138 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0167-7055 |
ispartof | Computer graphics forum, 2020-10, Vol.39 (7), p.205-217 |
issn | 0167-7055 1467-8659 |
language | eng |
recordid | cdi_proquest_journals_2463486604 |
source | Wiley Online Library Journals Frontfile Complete; Business Source Complete |
subjects | bi‐directional path tracing; Caching; CCS Concepts; Computing methodologies → Ray tracing; Importance sampling; Light; Light sources; path tracing; Rendering; Sampling methods; shadowray; Visibility; Visibility maps |
title | Next Event Estimation++: Visibility Mapping for Efficient Light Transport Simulation |
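The second scenario named in the abstract, direct importance sampling of the visibility, can also be sketched: a light is chosen with probability proportional to its intensity times the cached visibility probability, and the contribution is later divided by that pdf to keep the estimator unbiased. The function signature, the `(position, intensity)` light representation, and the small floor on the weights are illustrative assumptions, not the paper's method.

```python
import random

def sample_light(lights, shading_point, vis_prob, rng=random.random):
    """Pick one light index for next event estimation; returns (index, pdf).

    `lights` is a list of (position, intensity) tuples; `vis_prob(p, q)`
    returns a cached estimate of the probability that p sees q (e.g. from
    a voxel-to-voxel visibility map).
    """
    # Weight each light by intensity times estimated visibility; a small
    # floor keeps fully "occluded" lights selectable so the estimator
    # cannot permanently miss a light whose cache entry is pessimistic.
    weights = [intensity * max(vis_prob(shading_point, pos), 1e-4)
               for pos, intensity in lights]
    total = sum(weights)

    # Standard inverse-CDF selection over the discrete weights.
    u = rng() * total
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if u <= acc:
            return i, w / total
    return len(lights) - 1, weights[-1] / total   # numerical fallback
```

Dividing the sampled light's contribution by the returned pdf reweights the estimate, so lights that the cache deems mostly occluded are sampled rarely but still counted fully when chosen.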