Adaptive Gaussian Process Approximation for Bayesian Inference with Expensive Likelihood Functions
We consider Bayesian inference problems with computationally intensive likelihood functions. We propose a Gaussian process (GP)–based method to approximate the joint distribution of the unknown parameters and the data, built on recent work (Kandasamy, Schneider, & Póczos, ). In particular, we write the joint density approximately as a product of an approximate posterior density and an exponentiated GP surrogate. We then provide an adaptive algorithm to construct such an approximation, where an active learning method is used to choose the design points. With numerical examples, we illustrate that the proposed method has competitive performance against existing approaches for Bayesian computation.
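The abstract's construction — an exponentiated GP surrogate of the log-likelihood, refined by active learning — can be sketched on a toy one-dimensional problem. Everything below is illustrative: the RBF kernel, its length-scale, the flat prior on the grid, and the variance-times-posterior-mass acquisition rule are assumptions standing in for the paper's actual algorithm, not a reproduction of it.

```python
import numpy as np

# Toy stand-in for an expensive log-likelihood (a Gaussian bump at theta = 1).
# In a real problem each call would be a costly simulation.
def expensive_loglik(theta):
    return -0.5 * ((theta - 1.0) / 0.3) ** 2

def rbf_kernel(a, b, ls=0.5, var=1.0):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """GP regression: predictive mean and variance at Xs given data (X, y)."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = rbf_kernel(Xs, Xs).diagonal() - np.sum(v * v, axis=0)
    return mu, np.maximum(var, 0.0)

# Adaptive design: evaluate the expensive log-likelihood where the surrogate
# is uncertain AND the surrogate posterior carries mass -- a generic
# active-learning heuristic, not the paper's exact acquisition rule.
grid = np.linspace(-2.0, 4.0, 400)          # flat prior on [-2, 4]
X = np.linspace(-2.0, 4.0, 5)               # initial design points
y = np.array([expensive_loglik(t) for t in X])
for _ in range(10):
    mu, var = gp_posterior(X, y, grid)
    weight = np.exp(mu - mu.max())          # unnormalised surrogate posterior
    t_new = grid[np.argmax(var * weight)]   # posterior-weighted uncertainty
    X = np.append(X, t_new)
    y = np.append(y, expensive_loglik(t_new))

# Final surrogate posterior, normalised on the grid.
mu, _ = gp_posterior(X, y, grid)
post = np.exp(mu - mu.max())
post /= post.sum() * (grid[1] - grid[0])
map_est = grid[np.argmax(post)]             # should land near theta = 1
```

Weighting the predictive variance by the current surrogate posterior concentrates the expensive evaluations where they actually change the posterior, rather than reducing surrogate error uniformly over the prior support — the same motivation the abstract gives for choosing design points by active learning.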
Published in: Neural computation, 2018-11, Vol. 30 (11), p. 3072-3094
Main authors: Wang, Hongqiao; Li, Jinglai
Format: Article
Language: English
Online access: Full text
creator | Wang, Hongqiao; Li, Jinglai |
doi_str_mv | 10.1162/neco_a_01127 |
format | Article |
publisher | MIT Press |
pmid | 30216145 |
identifier | ISSN: 0899-7667 |
ispartof | Neural computation, 2018-11, Vol.30 (11), p.3072-3094 |
issn | 0899-7667 (print); 1530-888X (online) |
language | eng |
recordid | cdi_pubmed_primary_30216145 |
source | MIT Press Journals |
subjects | Adaptive algorithms; Approximation; Bayesian analysis; Density; Gaussian process; Machine learning; Statistical inference |
title | Adaptive Gaussian Process Approximation for Bayesian Inference with Expensive Likelihood Functions |