Bayesian additive regression trees and the General BART model

Bayesian additive regression trees (BART) is a flexible prediction model/machine learning approach that has gained widespread popularity in recent years. As BART becomes more mainstream, there is an increased need for a paper that walks readers through the details of BART, from what it is to why it works. This tutorial is aimed at providing such a resource.

Saved in:
Bibliographic details
Main authors: Tan, Yaoyuan Vincent; Roy, Jason
Format: Article
Language: English
Subjects:
Online access: Order full text
creator Tan, Yaoyuan Vincent; Roy, Jason
description Bayesian additive regression trees (BART) is a flexible prediction model/machine learning approach that has gained widespread popularity in recent years. As BART becomes more mainstream, there is an increased need for a paper that walks readers through the details of BART, from what it is to why it works. This tutorial is aimed at providing such a resource. In addition to explaining the different components of BART using simple examples, we also discuss a framework, the General BART model, that unifies some of the recent BART extensions, including semiparametric models, correlated outcomes, statistical matching problems in surveys, and models with weaker distributional assumptions. By showing how these models fit into a single framework, we hope to demonstrate a simple way of applying BART to research problems that go beyond the original independent continuous or binary outcomes framework.
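The "sum of trees" idea behind BART, described in the abstract above, can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: actual BART fits many shallow trees by MCMC under a regularization prior that shrinks each tree's leaf values toward zero so no single tree dominates. The `Stump` class and `bart_style_predict` function below are hypothetical names used only for this sketch.

```python
# Illustrative sketch of BART's sum-of-trees form: f(x) = sum_j g(x; T_j, M_j).
# Each "tree" here is a depth-1 stump with deliberately small leaf values,
# mimicking the effect of BART's shrinkage prior on leaf parameters.

from dataclasses import dataclass


@dataclass
class Stump:
    """A depth-1 'tree': split on one feature at a threshold."""
    feature: int
    threshold: float
    left_value: float   # leaf value when x[feature] <= threshold
    right_value: float  # leaf value otherwise

    def predict(self, x):
        return self.left_value if x[self.feature] <= self.threshold else self.right_value


def bart_style_predict(trees, x):
    """Sum-of-trees prediction: each weak tree contributes a small amount."""
    return sum(t.predict(x) for t in trees)


# Two toy stumps; their small leaf values stand in for the shrinkage prior.
trees = [
    Stump(feature=0, threshold=0.5, left_value=-0.1, right_value=0.2),
    Stump(feature=1, threshold=1.0, left_value=0.05, right_value=-0.05),
]
print(bart_style_predict(trees, [0.3, 2.0]))  # approximately -0.15
```

In real BART the trees and leaf values are sampled from a posterior rather than fixed, but the prediction step is exactly this additive form, which is what makes the General BART extensions in the paper (semiparametric terms, correlated outcomes) possible by swapping out pieces around the sum of trees.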
doi_str_mv 10.48550/arxiv.1901.07504
format Article
identifier DOI: 10.48550/arxiv.1901.07504
language eng
source arXiv.org
subjects Statistics - Applications
title Bayesian additive regression trees and the General BART model