Fully Nonparametric Bayesian Additive Regression Trees

University of Pennsylvania, USA
Medical College of Wisconsin, USA
Arizona State University, USA

Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part B

ISBN: 978-1-83867-420-5, eISBN: 978-1-83867-419-9

Publication date: 18 October 2019

Abstract

Bayesian additive regression trees (BART) is a fully Bayesian approach to modeling with ensembles of trees. BART can uncover complex regression functions with high-dimensional regressors in a fairly automatic way and provide Bayesian quantification of the uncertainty through the posterior. However, BART assumes independent and identically distributed (i.i.d.) normal errors. This strong parametric assumption can lead to misleading inference and uncertainty quantification. In this chapter we use the classic Dirichlet process mixture (DPM) mechanism to nonparametrically model the error distribution. A key strength of BART is that default prior settings work reasonably well in a variety of problems. The challenge in extending BART is to choose the parameters of the DPM so that the strengths of the standard BART approach are not lost when the errors are close to normal, while the DPM retains the ability to adapt to non-normal errors.
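The DPM mechanism referenced above is often implemented via the stick-breaking construction. The following is a minimal illustrative sketch, not the chapter's actual prior specification: it draws errors from a truncated DPM of normals, with hyperparameters (`alpha`, `mu0`, `tau0`, `sigma`, `trunc`) chosen arbitrarily for demonstration rather than taken from the chapter.

```python
import numpy as np

def sample_dpm_errors(n, alpha=1.0, mu0=0.0, tau0=1.0, sigma=0.5,
                      trunc=50, seed=None):
    """Draw n errors from a truncated Dirichlet process mixture of normals.

    Stick-breaking construction: v_k ~ Beta(1, alpha), and the mixture
    weights are w_k = v_k * prod_{j<k} (1 - v_j). Each component is a
    normal with atom mu_k ~ N(mu0, tau0^2) and common scale sigma.
    (All hyperparameter values here are illustrative defaults.)
    """
    rng = np.random.default_rng(seed)
    v = rng.beta(1.0, alpha, size=trunc)
    # Stick-breaking weights, renormalized to correct for truncation.
    w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    w = w / w.sum()
    mu = rng.normal(mu0, tau0, size=trunc)
    # Assign each draw to a mixture component, then sample its error.
    z = rng.choice(trunc, size=n, p=w)
    return rng.normal(mu[z], sigma)

errors = sample_dpm_errors(1000, alpha=2.0, seed=0)
```

With a small `alpha` the weights concentrate on few components, so the error distribution stays close to a single normal; larger `alpha` spreads mass across more atoms, letting the mixture adapt to skewness or multimodality, which mirrors the adaptivity the abstract describes.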

Citation

George, E., Laud, P., Logan, B., McCulloch, R. and Sparapani, R. (2019), "Fully Nonparametric Bayesian Additive Regression Trees", Topics in Identification, Limited Dependent Variables, Partial Observability, Experimentation, and Flexible Modeling: Part B (Advances in Econometrics, Vol. 40B), Emerald Publishing Limited, Leeds, pp. 89-110. https://doi.org/10.1108/S0731-90532019000040B006

Publisher: Emerald Publishing Limited

Copyright © 2019 Emerald Publishing Limited