BayesTree

BayesTree: Bayesian Methods for Tree-Based Models. Implementation of BART: Bayesian Additive Regression Trees. We develop a Bayesian “sum-of-trees” model where each tree is constrained by a regularization prior to be a weak learner, and fitting and inference are accomplished via an iterative Bayesian backfitting MCMC algorithm that generates samples from a posterior. Effectively, BART is a nonparametric Bayesian regression approach which uses dimensionally adaptive random basis elements. Motivated by ensemble methods in general, and boosting algorithms in particular, BART is defined by a statistical model: a prior and a likelihood. This approach enables full posterior inference, including point and interval estimates of the unknown regression function as well as the marginal effects of potential predictors. By keeping track of predictor inclusion frequencies, BART can also be used for model-free variable selection. BART’s many features are illustrated in a bake-off against competing methods on 42 different data sets, in a simulation experiment, and on a drug discovery classification problem.
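
For orientation, here is a minimal sketch of a typical call, following the documented BayesTree::bart() interface; the simulated data, variable names, and the settings shown are illustrative assumptions, not part of this entry:

library(BayesTree)

## Illustrative simulated data (an assumption, not from the entry):
## y depends on x1 and x2 only; x3-x5 are pure noise predictors.
set.seed(99)
n <- 200
x <- matrix(runif(n * 5), n, 5)
colnames(x) <- paste0("x", 1:5)
y <- 10 * sin(pi * x[, 1] * x[, 2]) + rnorm(n)

## Fit the sum-of-trees model by backfitting MCMC:
## ntree weak-learner trees, ndpost posterior draws after nskip burn-in.
fit <- bart(x.train = x, y.train = y, x.test = x,
            ntree = 200, ndpost = 1000, nskip = 100)

## Point and interval estimates of the regression function f(x):
## each row of yhat.test is one posterior draw of f at the test points.
post.mean <- colMeans(fit$yhat.test)
post.int  <- apply(fit$yhat.test, 2, quantile, probs = c(0.025, 0.975))

## Model-free variable selection via predictor inclusion frequencies:
## varcount records, per posterior draw, how often each predictor is
## used in a splitting rule; informative predictors should dominate.
colMeans(fit$varcount)

Under the default prior settings (k = 2, tree prior with base = 0.95 and power = 2), each tree is shrunk toward a weak learner, which is the regularization prior described above.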


References in zbMATH (referenced in 41 articles, 1 standard article)

Showing results 1 to 20 of 41, sorted by year (citations).


  1. Hernández, Belinda; Raftery, Adrian E.; Pennington, Stephen R.; Parnell, Andrew C.: Bayesian additive regression trees using Bayesian model averaging (2018)
  2. Conversano, Claudio; Dusseldorp, Elise: Modeling threshold interaction effects through the logistic classification trunk (2017)
  3. Guo, Wentian; Ji, Yuan; Catenacci, Daniel V. T.: A subgroup cluster-based Bayesian adaptive design for precision medicine (2017)
  4. Hu, Ruimeng; Ludkovski, Mike: Sequential design for ranking response surfaces (2017)
  5. Kapelner, Adam; Bleich, Justin: bartMachine: machine learning with Bayesian additive regression trees (2016)
  6. Geurts, Pierre; Wehenkel, Louis: Comments on: “A random forest guided tour” (2016)
  7. Gray-Davies, Tristan; Holmes, Chris C.; Caron, François: Scalable Bayesian nonparametric regression via a Plackett-Luce model for conditional ranks (2016)
  8. Guhaniyogi, Rajarshi; Dunson, David B.: Compressed Gaussian process for manifold regression (2016)
  9. Mentch, Lucas; Hooker, Giles: Quantifying uncertainty in random forests via confidence intervals and hypothesis tests (2016)
  10. Wager, Stefan: Comments on: “A random forest guided tour” (2016)
  11. Green, Peter J.; Łatuszyński, Krzysztof; Pereyra, Marcelo; Robert, Christian P.: Bayesian computation: a summary of the current state, and samples backwards and forwards (2015)
  12. Kapelner, Adam; Bleich, Justin: Prediction with missing data via Bayesian additive regression trees (2015)
  13. Letham, Benjamin; Rudin, Cynthia; McCormick, Tyler H.; Madigan, David: Interpretable classifiers using rules and Bayesian analysis: building a better stroke prediction model (2015)
  14. Low-Kam, Cecile; Telesca, Donatello; Ji, Zhaoxia; Zhang, Haiyuan; Xia, Tian; Zink, Jeffrey I.; Nel, Andre E.: A Bayesian regression tree approach to identify the effect of nanoparticles’ properties on toxicity profiles (2015)
  15. Quintana, Fernando A.; Müller, Peter; Papoila, Ana Luisa: Cluster-specific variable selection for product partition models (2015)
  16. Steorts, Rebecca C.: Entity resolution with empirically motivated priors (2015)
  17. Yang, Yun; Tokdar, Surya T.: Minimax-optimal nonparametric regression in high dimensions (2015)
  18. Bleich, Justin; Kapelner, Adam; George, Edward I.; Jensen, Shane T.: Variable selection for BART: an application to gene regulation (2014)
  19. Kapelner, Adam; Krieger, Abba: Matching on-the-fly: sequential allocation with higher power and efficiency (2014)
  20. Pati, Debdeep; Dunson, David B.: Bayesian nonparametric regression with varying residual density (2014)
