Neural Sequence Transformation

Monte Carlo integration is a technique for numerically estimating a definite integral by stochastically sampling its integrand. These samples can be averaged to make an improved estimate, and the progressive estimates form a sequence that converges to the integral value in the limit. Unfortunately,...

Detailed Description

Saved in:
Bibliographic details
Published in: Computer graphics forum 2021-10, Vol.40 (7), p.131-140
Main authors: Mukherjee, Sabyasachi, Mukherjee, Sayan, Hua, Binh‐Son, Umetani, Nobuyuki, Meister, Daniel
Format: Article
Language: English
Subjects:
Online access: Full text
Description: Monte Carlo integration is a technique for numerically estimating a definite integral by stochastically sampling its integrand. These samples can be averaged to make an improved estimate, and the progressive estimates form a sequence that converges to the integral value in the limit. Unfortunately, the sequence of Monte Carlo estimates converges at a rate of O(1/√n), where n denotes the sample count, effectively slowing down as more samples are drawn. To overcome this, we can apply sequence transformation, which transforms one converging sequence into another with the goal of accelerating the rate of convergence. However, analytically finding such a transformation for Monte Carlo estimates can be challenging, due to both the stochastic nature of the sequence, and the complexity of the integrand. In this paper, we propose to leverage neural networks to learn sequence transformations that improve the convergence of the progressive estimates of Monte Carlo integration. We demonstrate the effectiveness of our method on several canonical 1D integration problems as well as applications in light transport simulation.
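The abstract above builds on the classical idea of sequence transformation: mapping one converging sequence into another that converges faster. As a minimal illustration of that idea (not the paper's learned neural transformation), the sketch below applies Aitken's Δ² process, a standard analytic transformation, to the slowly converging partial sums of the Leibniz series for π; all names here are our own.

```python
import math

def aitken(s):
    """Aitken's delta-squared transformation: maps a converging
    sequence s to a new sequence that often converges faster."""
    t = []
    for n in range(len(s) - 2):
        denom = s[n + 2] - 2 * s[n + 1] + s[n]
        if denom == 0:
            # Differences vanished: the sequence has effectively converged.
            t.append(s[n + 2])
        else:
            t.append(s[n + 2] - (s[n + 2] - s[n + 1]) ** 2 / denom)
    return t

# Partial sums of the Leibniz series for pi, which converge slowly (O(1/n)).
partial = []
acc = 0.0
for k in range(20):
    acc += 4 * (-1) ** k / (2 * k + 1)
    partial.append(acc)

accelerated = aitken(partial)
print(abs(partial[-1] - math.pi))      # error of the raw partial sums
print(abs(accelerated[-1] - math.pi))  # error after the transformation (smaller)
```

The same transformation can in principle be applied to the running means of Monte Carlo samples, but, as the abstract notes, the stochastic noise in such sequences makes fixed analytic transformations unreliable, which motivates learning the transformation instead.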
DOI: 10.1111/cgf.14407
Publisher: Oxford: Blackwell Publishing Ltd
Rights: 2021 The Eurographics Association and John Wiley &amp; Sons Ltd.
ISSN: 0167-7055
EISSN: 1467-8659
Source: Business Source Complete; Wiley Online Library All Journals
Subjects:
CCS Concepts
Computing methodologies → Machine learning algorithms
Convergence
Estimates
Mathematics of computing → Numerical analysis
Neural networks
Probability and statistics
Ray tracing
Transformations