adabag
adabag: Applies multiclass AdaBoost.M1, AdaBoost-SAMME and Bagging. This package implements Freund and Schapire's AdaBoost.M1 algorithm and Breiman's Bagging algorithm, using classification trees as individual classifiers. Once these classifiers have been trained, they can be used to predict on new data, and cross-validation estimates of the prediction error can be obtained. Since version 2.0 the function "margins" computes the margins of these classifiers, and greater flexibility is provided through access to the "rpart.control" argument of "rpart". Version 3.0 introduced four important new features: AdaBoost-SAMME (Zhu et al., 2009) is implemented; the new function "errorevol" shows the error of the ensembles as a function of the number of iterations; the ensembles can be pruned using the "newmfinal" option of the predict.bagging and predict.boosting functions; and the posterior probability of each class can be obtained for each observation. Version 3.1 modifies the relative importance measure to take into account the gain of the Gini index given by a variable in each tree and the weights of these trees.
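A minimal usage sketch of the features described above, assuming the adabag interface (boosting, predict.boosting, margins, errorevol, boosting.cv) and the built-in iris data; the specific values mfinal = 50, maxdepth = 3 and newmfinal = 25 are illustrative choices only:

    # Load the package and rpart (for rpart.control)
    library(adabag)
    library(rpart)

    data(iris)
    train <- sample(nrow(iris), 100)

    # AdaBoost ensemble of 50 classification trees;
    # coeflearn = "Zhu" selects the AdaBoost-SAMME variant
    fit <- boosting(Species ~ ., data = iris[train, ], mfinal = 50,
                    coeflearn = "Zhu",
                    control = rpart.control(maxdepth = 3))

    # Predict on new data; newmfinal prunes the ensemble to its first 25 trees
    pred <- predict.boosting(fit, newdata = iris[-train, ], newmfinal = 25)
    pred$confusion   # confusion matrix on the test set
    pred$prob        # posterior probability of each class per observation

    # Margins of the trained classifier and evolution of the test error
    # as a function of the number of iterations
    marg <- margins(fit, iris[train, ])
    evol <- errorevol(fit, iris[-train, ])

    # 10-fold cross-validated error estimate
    cv <- boosting.cv(Species ~ ., data = iris, v = 10, mfinal = 50)
    cv$error

The corresponding bagging(), predict.bagging() and bagging.cv() functions follow the same pattern for Breiman's Bagging.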
References in zbMATH (referenced in 6 articles, 1 standard article)
- Gong, Joonho; Kim, Hyunjoong: RHSBoost: improving classification performance in imbalance data (2017)
- Weihs, Claus; Mersmann, Olaf; Ligges, Uwe: Foundations of statistical algorithms. With references to R packages (2014)
- Alfaro, Esteban; Gámez, Matías; García, Noelia: adabag: An R Package for Classification with Boosting and Bagging (2013) not zbMATH
- De Bock, Koen W.; Coussement, Kristof; Van den Poel, Dirk: Ensemble classification based on generalized additive models (2010)
- Murphy, Thomas Brendan; Dean, Nema; Raftery, Adrian E.: Variable selection and updating in model-based discriminant analysis for high dimensional data with food authenticity applications (2010)
- Zhu, Ji; Zou, Hui; Rosset, Saharon; Hastie, Trevor: Multi-class AdaBoost (2009)