Weighted Bayesian bootstrap for scalable posterior distributions

We introduce and develop a weighted Bayesian bootstrap (WBB) for machine learning and statistics. WBB provides uncertainty quantification by sampling from a high dimensional posterior distribution. WBB is computationally fast and scalable using only off-the-shelf optimization software. First-order asymptotic analysis provides a theoretical justification under suitable regularity conditions on the statistical model. We illustrate the proposed methodology in regularized regression, trend filtering and deep learning and conclude with directions for future research.
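The abstract describes WBB as drawing approximate posterior samples by repeatedly re-solving a randomly reweighted optimization problem with standard optimization software. As a rough illustration only (not code from the article), the sketch below applies that recipe to ridge-regularized regression, one of the regularized-regression settings the abstract mentions: each draw assigns independent Exponential(1) weights to the observations and to the penalty, re-solves the weighted problem in closed form, and the spread of the resulting optima serves as the uncertainty quantification. All function names, weight choices, and parameters here are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def wbb_ridge(X, y, lam=1.0, n_draws=200, seed=None):
    """Illustrative weighted Bayesian bootstrap for ridge regression.

    Each draw solves
        argmin_beta  sum_i w_i (y_i - x_i' beta)^2 + w0 * lam * ||beta||^2
    with i.i.d. Exponential(1) weights w_i on the observations and w0 on
    the penalty, using the closed-form weighted ridge solution. The
    collected optima are treated as approximate posterior samples.
    This is a sketch based on the abstract, not the article's code.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    draws = np.empty((n_draws, p))
    for b in range(n_draws):
        w = rng.exponential(1.0, size=n)   # observation weights
        w0 = rng.exponential(1.0)          # weight on the penalty / prior
        XtWX = X.T @ (w[:, None] * X)
        XtWy = X.T @ (w * y)
        draws[b] = np.linalg.solve(XtWX + w0 * lam * np.eye(p), XtWy)
    return draws

# Example usage on synthetic data: intervals come from the spread of the
# weighted optima across draws.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
beta_true = np.array([2.0, 0.0, -1.0, 0.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.5, size=100)
samples = wbb_ridge(X, y, lam=1.0, n_draws=500, seed=1)
print(samples.mean(axis=0))                          # point estimates
print(np.percentile(samples, [2.5, 97.5], axis=0))   # 95% intervals
```

Because each draw is just one more call to an off-the-shelf solver (here a linear solve), the procedure parallelizes trivially, which is what the abstract means by "computationally fast and scalable."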

Detailed Description

Bibliographic Details
Published in: Canadian journal of statistics 2021-06, Vol. 49 (2), p. 421-437
Main authors: NEWTON, Michael A., POLSON, Nicholas G., XU, Jianeng
Format: Article
Language: English
Subjects:
Online access: Full text
container_end_page 437
container_issue 2
container_start_page 421
container_title Canadian journal of statistics
container_volume 49
creator NEWTON, Michael A.
POLSON, Nicholas G.
XU, Jianeng
description We introduce and develop a weighted Bayesian bootstrap (WBB) for machine learning and statistics. WBB provides uncertainty quantification by sampling from a high dimensional posterior distribution. WBB is computationally fast and scalable using only off-the-shelf optimization software. First-order asymptotic analysis provides a theoretical justification under suitable regularity conditions on the statistical model. We illustrate the proposed methodology in regularized regression, trend filtering and deep learning and conclude with directions for future research.
doi_str_mv 10.1002/cjs.11570
format Article
publisher Hoboken, USA: Wiley
rights 2020 The Authors. The Canadian Journal of Statistics / La revue canadienne de statistique published by Wiley Periodicals LLC on behalf of the Statistical Society of Canada; published under the CC BY 4.0 license (http://creativecommons.org/licenses/by/4.0/)
orcidid https://orcid.org/0000-0001-9038-878X
fulltext fulltext
identifier ISSN: 0319-5724
ispartof Canadian journal of statistics, 2021-06, Vol.49 (2), p.421-437
issn 0319-5724
1708-945X
language eng
recordid cdi_proquest_journals_2533099258
source Wiley Online Library All Journals
subjects Asymptotic methods
Bayesian analysis
Bootstrap method
Deep learning
Justification
Machine learning
Markov chain Monte Carlo
Measurement
Optimization
Regression analysis
regularization
Statistical analysis
Statistical methods
Statistical models
trend filtering
Uncertainty
weighted bootstrap
title Weighted Bayesian bootstrap for scalable posterior distributions
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-08T22%3A21%3A25IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-jstor_proqu&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Weighted%20Bayesian%20bootstrap%20for%20scalable%20posterior%20distributions&rft.jtitle=Canadian%20journal%20of%20statistics&rft.au=NEWTON,%20Michael%20A.&rft.date=2021-06-01&rft.volume=49&rft.issue=2&rft.spage=421&rft.epage=437&rft.pages=421-437&rft.issn=0319-5724&rft.eissn=1708-945X&rft_id=info:doi/10.1002/cjs.11570&rft_dat=%3Cjstor_proqu%3E48762907%3C/jstor_proqu%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2533099258&rft_id=info:pmid/&rft_jstor_id=48762907&rfr_iscdi=true