Bayesian additive regression trees and the General BART model
Format: Article
Language: English
Abstract: Bayesian additive regression trees (BART) is a flexible prediction model/machine learning approach that has gained widespread popularity in recent years. As BART becomes more mainstream, there is an increased need for a paper that walks readers through the details of BART, from what it is to why it works. This tutorial is aimed at providing such a resource. In addition to explaining the different components of BART using simple examples, we also discuss a framework, the General BART model, that unifies some of the recent BART extensions, including semiparametric models, correlated outcomes, statistical matching problems in surveys, and models with weaker distributional assumptions. By showing how these models fit into a single framework, we hope to demonstrate a simple way of applying BART to research problems that go beyond the original independent continuous or binary outcomes framework.
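As a brief, hedged illustration of the kind of model the abstract describes, the sketch below fits a BART regression to simulated continuous outcomes using the third-party PyMC-BART package. The package, the `pmb.BART` call, the number of trees, and the toy data are assumptions for illustration only; they are not drawn from this record or from the paper itself.

```python
import numpy as np
import pymc as pm
import pymc_bart as pmb  # assumed third-party package; not referenced by the paper

# Simulated toy data: a nonlinear signal plus Gaussian noise.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))
y = np.sin(4 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.1, size=200)

with pm.Model():
    # Sum-of-trees prior on the regression function (here m = 50 trees).
    mu = pmb.BART("mu", X, y, m=50)
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("obs", mu=mu, sigma=sigma, observed=y)
    idata = pm.sample()  # posterior draws over the trees and the noise scale
```

This corresponds to the original independent continuous-outcome setting mentioned in the abstract; the General BART extensions the paper unifies modify the outcome model or error structure around the same sum-of-trees component.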
DOI: 10.48550/arxiv.1901.07504