A stochastic version of Stein Variational Gradient Descent for efficient sampling

We propose in this work RBM-SVGD, a stochastic version of the Stein Variational Gradient Descent (SVGD) method for efficiently sampling from a given probability measure, and thus useful for Bayesian inference. The method applies the Random Batch Method (RBM) for interacting particle systems proposed...

Detailed Description

Saved in:
Bibliographic Details
Published in: arXiv.org 2019-04
Main Authors: Li, Lei; Li, Yingzhou; Liu, Jian-Guo; Liu, Zibu; Lu, Jianfeng
Format: Article
Language: eng
Subjects:
Online Access: Full text
creator Li, Lei
Li, Yingzhou
Liu, Jian-Guo
Liu, Zibu
Lu, Jianfeng
description We propose in this work RBM-SVGD, a stochastic version of the Stein Variational Gradient Descent (SVGD) method for efficiently sampling from a given probability measure, and thus useful for Bayesian inference. The method applies the Random Batch Method (RBM) for interacting particle systems, proposed by Jin et al., to the interacting particle system in SVGD. While preserving the behavior of SVGD, it reduces the computational cost, especially when the interaction kernel has long range. Numerical examples verify the efficiency of this new version of SVGD.
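The description above outlines the algorithmic idea: run the usual SVGD particle update, but at each step compute the kernel interaction only within small random batches of particles, as in the Random Batch Method. As a rough illustration only (not the authors' code), the following NumPy sketch assumes an RBF kernel with a fixed bandwidth h, a batch-local average in place of the full-population average, and a user-supplied grad_log_p for the target log-density; the names rbf_kernel_and_grad and rbm_svgd_step are hypothetical.

```python
# Minimal sketch of one RBM-SVGD step with an RBF kernel (illustrative, not the paper's code).
import numpy as np

def rbf_kernel_and_grad(xb, h=1.0):
    """Return K[j, i] = k(x_j, x_i) and, for each i, sum_j grad_{x_j} k(x_j, x_i)."""
    diff = xb[:, None, :] - xb[None, :, :]               # diff[j, i] = x_j - x_i, shape (m, m, d)
    K = np.exp(-np.sum(diff**2, axis=-1) / (2 * h**2))   # RBF kernel matrix, shape (m, m)
    grad_K = -np.einsum('jid,ji->id', diff, K) / h**2    # summed kernel gradients, shape (m, d)
    return K, grad_K

def rbm_svgd_step(x, grad_log_p, eps=1e-2, batch_size=10, h=1.0, rng=None):
    """One RBM-SVGD step: shuffle the n particles, split them into random
    batches, and apply the SVGD update within each batch only."""
    rng = np.random.default_rng() if rng is None else rng
    n, _ = x.shape
    perm = rng.permutation(n)
    x_new = x.copy()
    for start in range(0, n, batch_size):
        idx = perm[start:start + batch_size]
        xb = x[idx]
        K, grad_K = rbf_kernel_and_grad(xb, h)
        # phi(x_i) = (1/m) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
        # (K is symmetric for the RBF kernel, so K.T @ g = K @ g here.)
        phi = (K.T @ grad_log_p(xb) + grad_K) / len(idx)
        x_new[idx] = xb + eps * phi
    return x_new

# Example usage on a standard Gaussian target, where grad log p(x) = -x:
# rng = np.random.default_rng(0)
# x = rng.normal(size=(100, 2))
# for _ in range(500):
#     x = rbm_svgd_step(x, lambda z: -z, rng=rng)
```

Each step costs on the order of n * batch_size kernel evaluations rather than the n^2 of plain SVGD, which is where the cost reduction mentioned in the abstract comes from; the exact kernel, bandwidth, and batching choices in the paper may differ from this sketch.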
doi 10.48550/arxiv.1902.03394
format Article
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2019-04
issn 2331-8422
language eng
recordid cdi_arxiv_primary_1902_03394
source arXiv.org; Free E-Journals
subjects Batch processing
Bayesian analysis
Computer Science - Learning
Mathematics - Probability
Probabilistic inference
Sampling
Statistical inference
Statistics - Machine Learning
title A stochastic version of Stein Variational Gradient Descent for efficient sampling