Simulation-Based Calibration Checking for Bayesian Computation: The Choice of Test Quantities Shapes Sensitivity
Saved in:
Main authors: | Modrák, Martin; Moon, Angie H; Kim, Shinyoung; Bürkner, Paul; Huurre, Niko; Faltejsková, Kateřina; Gelman, Andrew; Vehtari, Aki |
---|---|
Format: | Article |
Language: | English |
Subjects: | Statistics - Methodology |
Online access: | Order full text |
creator | Modrák, Martin; Moon, Angie H; Kim, Shinyoung; Bürkner, Paul; Huurre, Niko; Faltejsková, Kateřina; Gelman, Andrew; Vehtari, Aki |
description | Simulation-based calibration checking (SBC) is a practical method to validate computationally derived posterior distributions or their approximations. In this paper, we introduce a new variant of SBC to alleviate several known problems. Our variant allows the user, in principle, to detect any possible issue with the posterior, while previously reported implementations could never detect large classes of problems, including when the posterior is equal to the prior. This is made possible by including additional data-dependent test quantities when running SBC. We argue and demonstrate that the joint likelihood of the data is an especially useful test quantity. Some other types of test quantities and their theoretical and practical benefits are also investigated. We provide theoretical analysis of SBC, thereby providing a more complete understanding of the underlying statistical mechanisms. We also bring attention to a relatively common mistake in the literature and clarify the difference between SBC and checks based on the data-averaged posterior. We support our recommendations with numerical case studies on a multivariate normal example and a case study in implementing an ordered simplex data type for use with Hamiltonian Monte Carlo. The SBC variant introduced in this paper is implemented in the $\mathtt{SBC}$ R package. |
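The description above outlines the SBC procedure with data-dependent test quantities. As a minimal illustrative sketch (this is not the authors' $\mathtt{SBC}$ R package; the conjugate normal model, sample sizes, and all names below are assumptions chosen for illustration), SBC with the joint log-likelihood as test quantity can be written as:

```python
import numpy as np

# Minimal SBC sketch with a data-dependent test quantity (the joint log-likelihood).
# Illustrative model, not from the paper: theta ~ N(0, 1), y_i | theta ~ N(theta, 1).
rng = np.random.default_rng(0)
n_sims, n_draws, n_obs = 1000, 99, 10

def log_lik(theta, y):
    # Joint log-likelihood of y given theta (up to a constant); theta scalar or 1-D.
    theta = np.atleast_1d(np.asarray(theta, dtype=float))
    return -0.5 * np.sum((y[None, :] - theta[:, None]) ** 2, axis=1)

ranks = np.empty(n_sims, dtype=int)
for s in range(n_sims):
    theta_true = rng.normal()                    # 1. draw a parameter from the prior
    y = rng.normal(theta_true, 1.0, size=n_obs)  # 2. simulate data given that draw
    # 3. The exact conjugate posterior N(post_mean, post_var) stands in for the
    #    computational posterior being checked.
    post_var = 1.0 / (1.0 + n_obs)
    post_mean = post_var * y.sum()
    draws = rng.normal(post_mean, np.sqrt(post_var), size=n_draws)
    # 4. Rank the test quantity at the prior draw among the posterior draws.
    ranks[s] = int(np.sum(log_lik(draws, y) < log_lik(theta_true, y)))

# With a correct posterior, ranks are ~uniform on {0, ..., n_draws}; a posterior
# that, say, simply returns the prior would skew this histogram and be detected.
```

A histogram of `ranks` (or a formal uniformity test) then serves as the calibration check; substituting a deliberately wrong posterior for step 3 makes the ranks visibly non-uniform.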
doi_str_mv | 10.48550/arxiv.2211.02383 |
format | Article |
creationdate | 2022-11-04 |
rights | http://creativecommons.org/licenses/by/4.0 |
published_version | https://doi.org/10.1214/23-BA1404 |
fulltext | fulltext_linktorsrc |
identifier | DOI: 10.48550/arxiv.2211.02383 |
language | eng |
recordid | cdi_arxiv_primary_2211_02383 |
source | arXiv.org |
subjects | Statistics - Methodology |
title | Simulation-Based Calibration Checking for Bayesian Computation: The Choice of Test Quantities Shapes Sensitivity |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-05T10%3A41%3A15IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Simulation-Based%20Calibration%20Checking%20for%20Bayesian%20Computation:%20The%20Choice%20of%20Test%20Quantities%20Shapes%20Sensitivity&rft.au=Modr%C3%A1k,%20Martin&rft.date=2022-11-04&rft_id=info:doi/10.48550/arxiv.2211.02383&rft_dat=%3Carxiv_GOX%3E2211_02383%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |