Variational Inference for Bayesian Bridge Regression

We study the implementation of Automatic Differentiation Variational Inference (ADVI) for Bayesian inference on regression models with bridge penalization. The bridge approach uses the $\ell_{\alpha}$ norm, with $\alpha \in (0, +\infty)$, to define a penalization on large values of the regression coefficients, which includes the Lasso ($\alpha = 1$) and ridge ($\alpha = 2$) penalizations as special cases. Full Bayesian inference seamlessly provides joint uncertainty estimates for all model parameters. Although MCMC approaches are available for bridge regression, they can be slow for large datasets, especially in high dimensions. The ADVI implementation allows the use of small batches of data at each iteration (due to stochastic gradient-based algorithms), therefore speeding up computation in comparison with MCMC. We illustrate the approach on non-parametric regression models with B-splines, although the method works seamlessly for other choices of basis functions. A simulation study shows the main properties of the proposed method.
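To make the bridge penalty and the minibatch idea from the abstract concrete, here is a minimal NumPy sketch. It is not the paper's ADVI method: instead of a variational posterior it fits a point estimate by stochastic gradient descent on the bridge-penalized least-squares objective, which is enough to show the $\ell_{\alpha}$ penalty (Lasso at $\alpha=1$, ridge at $\alpha=2$) and the small-batch gradients the abstract refers to. All names, the learning rate, and the choice $\alpha=1.5$, $\lambda=0.1$ are illustrative assumptions.

```python
import numpy as np

def bridge_penalty(beta, alpha, lam):
    """Bridge penalty lam * sum_j |beta_j|^alpha.
    alpha=1 recovers the Lasso penalty, alpha=2 the ridge penalty."""
    return lam * np.sum(np.abs(beta) ** alpha)

def minibatch_grad(X, y, beta, alpha, lam, idx):
    """Gradient of the penalized least-squares objective on a minibatch idx."""
    Xb, yb = X[idx], y[idx]
    resid = Xb @ beta - yb
    data_grad = 2.0 * Xb.T @ resid / len(idx)          # squared-error part
    pen_grad = lam * alpha * np.sign(beta) * np.abs(beta) ** (alpha - 1)
    return data_grad + pen_grad

# Synthetic sparse-regression data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_beta = np.array([3.0, 0.0, -2.0, 0.0, 0.0])
y = X @ true_beta + 0.1 * rng.normal(size=200)

# SGD with small batches, as in the minibatch setting the abstract describes
beta = np.zeros(5)
for step in range(2000):
    idx = rng.choice(200, size=32, replace=False)
    beta -= 0.01 * minibatch_grad(X, y, beta, alpha=1.5, lam=0.1, idx=idx)
```

Because each step touches only 32 of the 200 rows, the per-iteration cost is independent of the full sample size; this is the same mechanism that lets ADVI scale where full-data MCMC becomes slow.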

Bibliographic details
Main authors: Zanini, Carlos Tadeu Pagani; Migon, Helio dos Santos; Dias, Ronaldo
Format: Article
Language: eng
Subjects: Computer Science - Learning; Statistics - Computation; Statistics - Machine Learning; Statistics - Methodology
creator Zanini, Carlos Tadeu Pagani
Migon, Helio dos Santos
Dias, Ronaldo
description We study the implementation of Automatic Differentiation Variational Inference (ADVI) for Bayesian inference on regression models with bridge penalization. The bridge approach uses the $\ell_{\alpha}$ norm, with $\alpha \in (0, +\infty)$, to define a penalization on large values of the regression coefficients, which includes the Lasso ($\alpha = 1$) and ridge ($\alpha = 2$) penalizations as special cases. Full Bayesian inference seamlessly provides joint uncertainty estimates for all model parameters. Although MCMC approaches are available for bridge regression, they can be slow for large datasets, especially in high dimensions. The ADVI implementation allows the use of small batches of data at each iteration (due to stochastic gradient-based algorithms), therefore speeding up computation in comparison with MCMC. We illustrate the approach on non-parametric regression models with B-splines, although the method works seamlessly for other choices of basis functions. A simulation study shows the main properties of the proposed method.
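The description's non-parametric illustration rests on a B-spline design matrix: each basis function becomes a column, and the spline coefficients play the role of the regression coefficients that the bridge prior penalizes. The sketch below builds such a design matrix from scratch with the Cox-de Boor recursion; it is a generic NumPy illustration, not the authors' implementation, and the clamped knot vector on $[0, 1]$ is an assumed example.

```python
import numpy as np

def bspline_basis(x, knots, degree):
    """Evaluate all B-spline basis functions at points x (Cox-de Boor recursion).
    knots must be clamped: degree+1 repeated knots at each boundary."""
    x = np.asarray(x, dtype=float)
    n = len(knots) - degree - 1                # number of basis functions
    # Degree-0 basis: indicator of each half-open knot interval [t_i, t_{i+1})
    B = np.zeros((len(x), len(knots) - 1))
    for i in range(len(knots) - 1):
        B[:, i] = (knots[i] <= x) & (x < knots[i + 1])
    # Let the last nonempty interval also cover the right endpoint
    B[x == knots[-1], len(knots) - degree - 2] = 1.0
    # Raise the degree one step at a time
    for d in range(1, degree + 1):
        B_new = np.zeros((len(x), len(knots) - d - 1))
        for i in range(len(knots) - d - 1):
            denom1 = knots[i + d] - knots[i]
            denom2 = knots[i + d + 1] - knots[i + 1]
            term1 = (x - knots[i]) / denom1 * B[:, i] if denom1 > 0 else 0.0
            term2 = (knots[i + d + 1] - x) / denom2 * B[:, i + 1] if denom2 > 0 else 0.0
            B_new[:, i] = term1 + term2
        B = B_new
    return B[:, :n]

# Clamped cubic spline on [0, 1] with one interior knot at 0.5
knots = np.concatenate([np.zeros(4), [0.5], np.ones(4)])
B = bspline_basis(np.linspace(0, 1, 11), knots, 3)   # 11 x 5 design matrix
```

In a model like the one described, `B` would be the matrix $X$ in the penalized regression, so the basis-function coefficients receive the bridge penalization directly; swapping in another basis (e.g. Fourier or wavelets) only changes how the columns are built.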
doi_str_mv 10.48550/arxiv.2205.09515
format Article
creationdate 2022-05-19
rights http://creativecommons.org/licenses/by/4.0
identifier DOI: 10.48550/arxiv.2205.09515
language eng
source arXiv.org
subjects Computer Science - Learning
Statistics - Computation
Statistics - Machine Learning
Statistics - Methodology
title Variational Inference for Bayesian Bridge Regression
url https://arxiv.org/abs/2205.09515