BayesTree: Bayesian Methods for Tree-Based Models. Implementation of BART: Bayesian Additive Regression Trees. We develop a Bayesian “sum-of-trees” model in which each tree is constrained by a regularization prior to be a weak learner; fitting and inference are carried out via an iterative Bayesian backfitting MCMC algorithm that generates samples from the posterior. Effectively, BART is a nonparametric Bayesian regression approach that uses dimensionally adaptive random basis elements. Motivated by ensemble methods in general, and boosting algorithms in particular, BART is defined by a statistical model: a prior and a likelihood. This approach enables full posterior inference, including point and interval estimates of the unknown regression function as well as of the marginal effects of potential predictors. By keeping track of predictor inclusion frequencies, BART can also be used for model-free variable selection. BART’s many features are illustrated in a bake-off against competing methods on 42 data sets, in a simulation experiment, and on a drug discovery classification problem.
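The sum-of-trees idea behind BART can be illustrated with a deliberately simplified sketch. The toy below fixes a set of decision stumps (random split points) and runs only the Bayesian backfitting sweep: each stump's leaf means are Gibbs-sampled from their conjugate normal posteriors given the partial residuals left by the other trees. All names and constants here are illustrative; real BART additionally proposes changes to the tree structures themselves via Metropolis-Hastings and samples the residual variance, both omitted here.

```python
import math
import random

random.seed(0)

# Toy 1-D regression data: y = f(x) + Gaussian noise (illustrative only).
n = 60
xs = [i / n for i in range(n)]
f_true = [math.sin(2 * math.pi * x) for x in xs]
ys = [f + random.gauss(0, 0.3) for f in f_true]

# Simplified sum-of-trees model: m stumps with FIXED split points.
m = 20                              # number of weak learners
sigma2 = 0.3 ** 2                   # residual variance, assumed known
tau2 = (0.5 / math.sqrt(m)) ** 2    # shrinkage prior variance per leaf
splits = [random.random() for _ in range(m)]
leaves = [[0.0, 0.0] for _ in range(m)]    # [left mean, right mean]

def tree_pred(j, x):
    """Prediction of stump j at point x."""
    return leaves[j][0] if x < splits[j] else leaves[j][1]

def fit_pred(x):
    """Sum-of-trees prediction at point x."""
    return sum(tree_pred(j, x) for j in range(m))

n_iter, burn = 200, 100
post_sum = [0.0] * n

for it in range(n_iter):
    for j in range(m):              # Bayesian backfitting sweep
        # Partial residuals: data minus the fits of all OTHER trees.
        resid = [ys[i] - (fit_pred(xs[i]) - tree_pred(j, xs[i]))
                 for i in range(n)]
        for side in (0, 1):
            idx = [i for i in range(n)
                   if (xs[i] < splits[j]) == (side == 0)]
            if not idx:
                continue
            # Conjugate normal posterior for the leaf mean,
            # prior N(0, tau2), likelihood N(mu, sigma2).
            prec = len(idx) / sigma2 + 1.0 / tau2
            mean = (sum(resid[i] for i in idx) / sigma2) / prec
            leaves[j][side] = random.gauss(mean, 1.0 / math.sqrt(prec))
    if it >= burn:                  # accumulate post-burn-in draws
        for i in range(n):
            post_sum[i] += fit_pred(xs[i])

# Posterior mean prediction, averaged over retained MCMC draws.
post_mean = [s / (n_iter - burn) for s in post_sum]
rmse = math.sqrt(sum((post_mean[i] - f_true[i]) ** 2
                     for i in range(n)) / n)
```

The shrinkage prior variance `tau2` scales as 1/m, mirroring BART's regularization strategy of keeping each individual tree a weak learner so that no single tree dominates the fit; averaging the post-burn-in draws gives the posterior-mean estimate of the regression function, and the same draws would supply pointwise credible intervals.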

References in zbMATH (referenced in 26 articles)

Showing results 1 to 20 of 26, sorted by year (citations).


  1. Geurts, Pierre; Wehenkel, Louis: Comments on: “A random forest guided tour” (2016)
  2. Guhaniyogi, Rajarshi; Dunson, David B.: Compressed Gaussian process for manifold regression (2016)
  3. Mentch, Lucas; Hooker, Giles: Quantifying uncertainty in random forests via confidence intervals and hypothesis tests (2016)
  4. Wager, Stefan: Comments on: “A random forest guided tour” (2016)
  5. Green, Peter J.; Łatuszyński, Krzysztof; Pereyra, Marcelo; Robert, Christian P.: Bayesian computation: a summary of the current state, and samples backwards and forwards (2015)
  6. Bleich, Justin; Kapelner, Adam; George, Edward I.; Jensen, Shane T.: Variable selection for BART: an application to gene regulation (2014)
  7. Pati, Debdeep; Dunson, David B.: Bayesian nonparametric regression with varying residual density (2014)
  8. Gramacy, Robert B.; Taddy, Matt; Wild, Stefan M.: Variable selection and sensitivity analysis using dynamic trees, with an application to computer code performance tuning (2013)
  9. Hill, Jennifer; Su, Yu-Sung: Assessing lack of common support in causal inference using Bayesian nonparametrics: Implications for evaluating the effect of breastfeeding on children’s cognitive outcomes (2013)
  10. Imai, Kosuke; Ratkovic, Marc: Estimating treatment effect heterogeneity in randomized program evaluation (2013)
  11. Rusch, Thomas; Lee, Ilro; Hornik, Kurt; Jank, Wolfgang; Zeileis, Achim: Influencing elections with statistics: Targeting voters with logistic regression trees (2013)
  12. Chakraborty, Sounak: Bayesian multiple response kernel regression model for high dimensional data and its practical applications in near infrared spectroscopy (2012)
  13. Karabatsos, George; Walker, Stephen G.: Adaptive-modal Bayesian nonparametric regression (2012)
  14. Scheipl, Fabian; Fahrmeir, Ludwig; Kneib, Thomas: Spike-and-slab priors for function selection in structured additive regression models (2012)
  15. Wang, Hao; Reiter, Jerome P.: Multiple imputation for sharing precise geographies in public use data (2012)
  16. Jasra, Ajay; Holmes, Christopher C.: Stochastic boosting algorithms (2011)
  17. Jasra, Ajay; Holmes, Christopher C.: Stochastic boosting algorithms (2011)
  18. Yu, Qingzhao; MacEachern, Steven N.; Peruggia, Mario: Bayesian synthesis: combining subjective analyses, with an application to ozone data (2011)
  19. Chipman, Hugh A.; George, Edward I.; McCulloch, Robert E.: BART: Bayesian additive regression trees (2010)
  20. Lian, Heng: Sparse Bayesian hierarchical modeling of high-dimensional clustering problems (2010)
