gbm

gbm: Generalized Boosted Regression Models. This package implements extensions to Freund and Schapire’s AdaBoost algorithm and Friedman’s gradient boosting machine. It includes regression methods for least squares, absolute loss, t-distribution loss, quantile regression, logistic, multinomial logistic, Poisson, Cox proportional hazards partial likelihood, AdaBoost exponential loss, Huberized hinge loss, and Learning to Rank measures (LambdaMart).
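
For orientation, a minimal usage sketch (not part of this zbMATH entry): the package's main fitting function, gbm(), selects among these loss functions through its distribution argument. The simulated data frame and variable names below are hypothetical.

    # Fit a boosted logistic-regression model with the gbm package;
    # the data frame `df` is purely illustrative.
    library(gbm)

    set.seed(1)
    df <- data.frame(x1 = rnorm(200), x2 = rnorm(200))
    df$y <- rbinom(200, 1, plogis(df$x1 - 0.5 * df$x2))

    # `distribution` picks the loss: "gaussian" (least squares), "laplace"
    # (absolute loss), "tdist", "quantile", "bernoulli" (logistic),
    # "multinomial", "poisson", "coxph", "adaboost", "huberized", "pairwise".
    fit <- gbm(y ~ x1 + x2,
               data = df,
               distribution = "bernoulli",
               n.trees = 500,
               interaction.depth = 2,
               shrinkage = 0.05,
               cv.folds = 5)

    # Choose the number of trees by cross-validation, then predict probabilities.
    best_iter <- gbm.perf(fit, method = "cv", plot.it = FALSE)
    pred <- predict(fit, newdata = df, n.trees = best_iter, type = "response")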

References in zbMATH (referenced in 44 articles; results 1 to 20 shown, sorted by year and citations)

  1. Berk, Richard A.: Statistical learning from a regression perspective (2020)
  2. Chu, Jianghao; Lee, Tae-Hwy; Ullah, Aman: Component-wise AdaBoost algorithms for high-dimensional binary classification and class probability prediction (2020)
  3. Alireza S. Mahani; Mansour T.A. Sharabiani: Bayesian, and Non-Bayesian, Cause-Specific Competing-Risk Analysis for Parametric and Nonparametric Survival Functions: The R Package CFC (2019) not zbMATH
  4. Biau, G.; Cadre, B.; Rouvière, L.: Accelerated gradient boosting (2019)
  5. Cerqueira, Vitor; Torgo, Luís; Pinto, Fábio; Soares, Carlos: Arbitrage of forecasting experts (2019)
  6. Choi, Byeong Yeob; Wang, Chen-Pin; Michalek, Joel; Gelfond, Jonathan: Power comparison for propensity score methods (2019)
  7. Ramosaj, Burim; Pauly, Markus: Predicting missing values: a comparative study on non-parametric approaches for imputation (2019)
  8. Tu, Chunhao: Comparison of various machine learning algorithms for estimating generalized propensity score (2019)
  9. Au, Timothy C.: Random forests, decision trees, and categorical predictors: the “absent levels” problem (2018)
  10. Lee, Simon C. K.; Lin, Sheldon: Delta boosting machine with application to general insurance (2018)
  11. Quan, Zhiyu; Valdez, Emiliano A.: Predictive analytics of insurance claims using multivariate decision trees (2018)
  12. Yukinobu Hamuro; Masakazu Nakamoto; Stephane Cheung; Edward Ip: mbonsai: Application Package for Sequence Classification by Tree Methodology (2018) not zbMATH
  13. Wauters, Mathieu; Vanhoucke, Mario: A nearest neighbour extension to project duration forecasting with artificial intelligence (2017)
  14. Blaser, Rico; Fryzlewicz, Piotr: Random rotation ensembles (2016)
  15. De Bin, Riccardo: Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages CoxBoost and mboost (2016)
  16. Dubossarsky, E.; Friedman, J. H.; Ormerod, J. T.; Wand, M. P.: Wavelet-based gradient boosting (2016)
  17. Li, Lin; Li, Yang; Qin, Yichen; Chen, Jiaxu; Wang, Limin; Yi, Danhui: Adaptive stochastic gradient boosting tree with composite criterion (2016)
  18. Yılmaz Isıkhan, Selen; Karabulut, Erdem; Alpar, Celal Reha: Determining cutoff point of ensemble trees based on sample size in predicting clinical dose with DNA microarray data (2016)
  19. Hadiji, Fabian; Molina, Alejandro; Natarajan, Sriraam; Kersting, Kristian: Poisson dependency networks: gradient boosted models for multivariate count data (2015)
  20. Kotthaus, Helena; Korb, Ingo; Lang, Michel; Bischl, Bernd; Rahnenführer, Jörg; Marwedel, Peter: Runtime and memory consumption analyses for machine learning R programs (2015)
