Surrogate-assisted parallel tempering for Bayesian neural learning
Due to the need for robust uncertainty quantification, Bayesian neural learning has gained attention in the era of deep learning and big data. Bayesian inference is typically implemented with Markov chain Monte Carlo (MCMC) methods, which face several challenges given the large number of parameters, complex and multimodal posterior distributions, and the computational cost of large neural network models...
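The title and abstract outline two ingredients: a ladder of tempered MCMC chains that occasionally swap states, and a surrogate trained on past likelihood evaluations that stands in for the expensive model. The following Python sketch is a minimal illustration of that general scheme, not the authors' implementation: the toy regression likelihood, the inverse-distance-weighted surrogate, the temperature ladder, and all tuning constants are assumptions chosen to keep the example short and runnable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem standing in for an expensive model. The true
# log-likelihood here is cheap, so the sketch actually runs end to end.
X = rng.normal(size=(50, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + rng.normal(scale=0.1, size=50)

def log_likelihood(w, sigma=0.1):
    resid = y - X @ w
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

def log_prior(w):
    return -0.5 * np.sum(w ** 2)  # standard normal prior on the weights

# Crude surrogate: inverse-distance-weighted average over stored
# (parameter, log-likelihood) pairs. Any regressor could play this role.
class Surrogate:
    def __init__(self, k=5, min_train=50):
        self.k, self.min_train = k, min_train
        self.thetas, self.values = [], []

    def add(self, theta, value):
        self.thetas.append(theta)
        self.values.append(value)

    def ready(self):
        return len(self.thetas) >= self.min_train

    def predict(self, theta):
        T, v = np.array(self.thetas), np.array(self.values)
        d = np.linalg.norm(T - theta, axis=1) + 1e-9
        nearest = np.argsort(d)[: self.k]
        weights = 1.0 / d[nearest]
        return np.sum(weights * v[nearest]) / np.sum(weights)

temps = [1.0, 2.0, 4.0, 8.0]                  # temperature ladder
chains = [rng.normal(size=3) for _ in temps]  # one chain per temperature
loglik = [log_likelihood(w) for w in chains]
surrogate, p_surrogate = Surrogate(), 0.5     # half the proposals use the surrogate
samples = []

for step in range(5000):
    # Within-chain Metropolis step on the tempered posterior L^(1/T) * prior.
    for i, T in enumerate(temps):
        prop = chains[i] + rng.normal(scale=0.1, size=3)
        if surrogate.ready() and rng.random() < p_surrogate:
            ll = surrogate.predict(prop)   # cheap estimated likelihood
        else:
            ll = log_likelihood(prop)      # expensive true likelihood
            surrogate.add(prop, ll)        # grow the surrogate's training set
        log_alpha = (ll - loglik[i]) / T + log_prior(prop) - log_prior(chains[i])
        if np.log(rng.random()) < log_alpha:
            chains[i], loglik[i] = prop, ll
    # Propose a swap between a random pair of neighbouring temperatures.
    j = rng.integers(len(temps) - 1)
    log_swap = (1 / temps[j] - 1 / temps[j + 1]) * (loglik[j + 1] - loglik[j])
    if np.log(rng.random()) < log_swap:
        chains[j], chains[j + 1] = chains[j + 1], chains[j]
        loglik[j], loglik[j + 1] = loglik[j + 1], loglik[j]
    if step > 1000:                        # discard burn-in
        samples.append(chains[0].copy())   # only the T=1 chain targets the posterior

print("posterior mean:", np.mean(samples, axis=0), "true weights:", w_true)
```

Note that accepting surrogate-estimated likelihoods, as in the naive loop above, generally biases the chain; managing that trade-off between computational cost and sampling quality is what the paper addresses.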
Published in: | arXiv.org 2020-05 |
---|---|
Main authors: | Chandra, Rohitash; Jain, Konark; Kapoor, Arpit; Ashray Aman |
Format: | Article |
Language: | English |
Subjects: | Bayesian analysis; Computation; Computer simulation; Data management; Decision making; Machine learning; Markov analysis; Markov chains; Mathematical models; Monte Carlo simulation; Optimization; Parameters; Tempering |
Online access: | Full text |
container_title | arXiv.org |
creator | Chandra, Rohitash; Jain, Konark; Kapoor, Arpit; Ashray Aman |
description | Due to the need for robust uncertainty quantification, Bayesian neural learning has gained attention in the era of deep learning and big data. Bayesian inference is typically implemented with Markov chain Monte Carlo (MCMC) methods, which face several challenges given the large number of parameters, complex and multimodal posterior distributions, and the computational cost of large neural network models. Parallel tempering MCMC addresses some of these limitations, since it can sample multimodal posterior distributions and exploit high-performance computing. However, challenges remain for large neural network models and big data. Surrogate-assisted optimization estimates the objective function of models that are computationally expensive to evaluate. In this paper, we address the inefficiency of parallel tempering MCMC for large-scale problems by combining parallel computing with surrogate-assisted estimation of the likelihood, which describes the plausibility of a model parameter value given specific observed data. Hence, we present surrogate-assisted parallel tempering for Bayesian neural learning, for simple through computationally expensive models. Our results demonstrate that the methodology significantly lowers the computational cost while maintaining the quality of decision making with Bayesian neural networks. The method has applications in Bayesian inversion and uncertainty quantification for a broad range of numerical models. |
format | Article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2020-05 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_2136818147 |
source | Free E-Journals |
subjects | Bayesian analysis; Computation; Computer simulation; Data management; Decision making; Machine learning; Markov analysis; Markov chains; Mathematical models; Monte Carlo simulation; Optimization; Parameters; Tempering |
title | Surrogate-assisted parallel tempering for Bayesian neural learning |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-04T10%3A45%3A03IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=Surrogate-assisted%20parallel%20tempering%20for%20Bayesian%20neural%20learning&rft.jtitle=arXiv.org&rft.au=Chandra,%20Rohitash&rft.date=2020-05-14&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2136818147%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2136818147&rft_id=info:pmid/&rfr_iscdi=true |