Numerical Solution of the Parametric Diffusion Equation by Deep Neural Networks
We perform a comprehensive numerical study of the effect of approximation-theoretical results for neural networks on practical learning problems in the context of numerical analysis. As the underlying model, we study the machine-learning-based solution of parametric partial differential equations. Here, approximation theory predicts that the performance of the model should depend only very mildly on the dimension of the parameter space and is determined by the intrinsic dimension of the solution manifold of the parametric partial differential equation. We use various methods to establish comparability between test-cases by minimizing the effect of the choice of test-cases on the optimization and sampling aspects of the learning problem. We find strong support for the hypothesis that approximation-theoretical effects heavily influence the practical behavior of learning problems in numerical analysis.
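The abstract concerns learning the parameter-to-solution map of a parametric diffusion equation. As a minimal illustrative sketch (not the authors' actual setup), the following code generates training pairs for a 1D affine-parametric diffusion problem -(a(x; y) u'(x))' = f(x) on [0,1] with homogeneous Dirichlet boundary conditions, using a standard finite-difference solver; the parameter dimension, coefficient expansion, sample count, and grid size are all illustrative choices.

```python
import numpy as np

def solve_diffusion(a_vals, f_vals, h):
    """Solve -(a u')' = f on [0,1], u(0) = u(1) = 0, by central finite
    differences. a_vals holds the diffusion coefficient at the n cell
    midpoints; f_vals holds the right-hand side at the n-1 interior nodes."""
    n = len(f_vals) + 1
    A = np.zeros((n - 1, n - 1))  # tridiagonal stiffness matrix
    for i in range(n - 1):
        A[i, i] = (a_vals[i] + a_vals[i + 1]) / h**2
        if i > 0:
            A[i, i - 1] = -a_vals[i] / h**2
        if i < n - 2:
            A[i, i + 1] = -a_vals[i + 1] / h**2
    return np.linalg.solve(A, f_vals)

# Build a training set: parameter y -> solution u(y) on a fixed grid.
rng = np.random.default_rng(0)
n = 64
h = 1.0 / n
x_mid = (np.arange(n) + 0.5) * h      # cell midpoints where a is evaluated
f = np.ones(n - 1)                    # constant source term

params, solutions = [], []
for _ in range(100):
    y = rng.uniform(-1, 1, size=4)    # 4-dimensional parameter (illustrative)
    # Affine-parametric coefficient a(x; y) = 2 + sum_j y_j sin(j pi x) / j^2,
    # bounded away from zero so the problem stays uniformly elliptic.
    a = 2.0 + sum(y[j] * np.sin((j + 1) * np.pi * x_mid) / (j + 1) ** 2
                  for j in range(4))
    params.append(y)
    solutions.append(solve_diffusion(a, f, h))
```

A feedforward network mapping the parameter vector y to the solution vector would then be trained on these pairs; the paper's hypothesis is that the achievable accuracy is governed by the intrinsic dimension of the solution manifold rather than by the dimension of the parameter space.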
Saved in:
Published in: | arXiv.org 2020-04 |
---|---|
Main authors: | Geist, Moritz; Petersen, Philipp; Raslan, Mones; Schneider, Reinhold; Kutyniok, Gitta |
Format: | Article |
Language: | eng |
Subjects: | Approximation; Artificial neural networks; Machine learning; Neural networks; Numerical analysis; Optimization; Partial differential equations |
Online access: | Full text |
container_title | arXiv.org |
---|---|
creator | Geist, Moritz; Petersen, Philipp; Raslan, Mones; Schneider, Reinhold; Kutyniok, Gitta |
description | We perform a comprehensive numerical study of the effect of approximation-theoretical results for neural networks on practical learning problems in the context of numerical analysis. As the underlying model, we study the machine-learning-based solution of parametric partial differential equations. Here, approximation theory predicts that the performance of the model should depend only very mildly on the dimension of the parameter space and is determined by the intrinsic dimension of the solution manifold of the parametric partial differential equation. We use various methods to establish comparability between test-cases by minimizing the effect of the choice of test-cases on the optimization and sampling aspects of the learning problem. We find strong support for the hypothesis that approximation-theoretical effects heavily influence the practical behavior of learning problems in numerical analysis. |
format | Article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2020-04 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_2395429302 |
source | Free E-Journals |
subjects | Approximation; Artificial neural networks; Machine learning; Neural networks; Numerical analysis; Optimization; Partial differential equations |
title | Numerical Solution of the Parametric Diffusion Equation by Deep Neural Networks |