
gbm
 Referenced in 60 articles
[sw07994]
 implements extensions to Freund and Schapire’s AdaBoost algorithm and Friedman’s gradient boosting machine ... logistic, Poisson, Cox proportional hazards partial likelihood, AdaBoost exponential loss, Huberized hinge loss, and Learning...
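
The gradient boosting machine that gbm extends can be sketched in a few lines: each new weak learner is fit to the residuals of the current ensemble (the negative gradient of squared loss), and added with a shrinkage factor. A minimal pure-Python illustration with 1-D regression stumps — not gbm's implementation; the data and names are made up:

```python
def fit_stump(xs, ys):
    """Fit a 1-D regression stump: a threshold with one constant
    prediction on each side, chosen to minimize squared error."""
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        lm = sum(left) / len(left) if left else 0.0
        rm = sum(right) / len(right) if right else 0.0
        err = (sum((y - lm) ** 2 for y in left)
               + sum((y - rm) ** 2 for y in right))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def gradient_boost(xs, ys, rounds=20, lr=0.5):
    """Gradient boosting with squared loss: each stump is fit to the
    current residuals, i.e. the negative gradient of the loss."""
    pred = [0.0] * len(xs)
    stumps = []
    for _ in range(rounds):
        residuals = [y - p for y, p in zip(ys, pred)]
        h = fit_stump(xs, residuals)
        stumps.append(h)
        pred = [p + lr * h(x) for p, x in zip(pred, xs)]
    return lambda x: sum(lr * h(x) for h in stumps)

# Toy data: a noisy step function the boosted stumps should recover.
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [1.0, 1.2, 0.9, 3.0, 3.1, 2.9]
f = gradient_boost(xs, ys)
```

Swapping the residual computation for the gradient of another loss (Poisson deviance, Cox partial likelihood, exponential loss, ...) is what gives gbm its menu of distributions.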

AdaCost
 Referenced in 30 articles
[sw33192]
 cost-sensitive boosting. AdaCost, a variant of AdaBoost, is a misclassification cost-sensitive boosting method ... reduce the cumulative misclassification cost more than AdaBoost. We formally show that AdaCost reduces ... reduction in the cumulative misclassification cost over AdaBoost without consuming additional computing power...
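
AdaCost's key change to AdaBoost is that the exponential weight update is scaled by a cost-adjustment function, so costly mistakes gain weight faster than cheap ones. A hedged sketch of one reweighting round, using one common choice of the adjustment function β (the paper leaves β as a design parameter; all names here are illustrative):

```python
import math

def adacost_round(weights, costs, margins, alpha):
    """One AdaCost-style reweighting step (sketch). margins[i] is
    y_i * h(x_i) in {-1, +1}; costs[i] in [0, 1] is the example's
    misclassification cost. beta increases the weight growth of
    costly mistakes and dampens the weight decay of costly correct
    examples; plain AdaBoost is recovered with beta = 1."""
    new = []
    for w, c, m in zip(weights, costs, margins):
        beta = 0.5 * (1 + c) if m < 0 else 0.5 * (1 - c)
        new.append(w * math.exp(-alpha * m * beta))
    z = sum(new)  # renormalize to a distribution
    return [w / z for w in new]
```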

AdaBoost-SAMME
 Referenced in 27 articles
[sw19134]
 Multiclass AdaBoost. Boosting has been a very successful technique for solving the two-class ... algorithm that directly extends the AdaBoost algorithm to the multiclass case without reducing ... show that the proposed multiclass AdaBoost algorithm is equivalent to a forward stagewise additive...
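
SAMME's extension to K classes is essentially a one-term change to the per-round classifier weight: an extra log(K − 1), so a weak learner only needs to beat random guessing (accuracy 1/K) rather than 1/2. A minimal sketch:

```python
import math

def samme_alpha(err, n_classes):
    """Per-round classifier weight in SAMME: AdaBoost's
    log((1 - err) / err) plus an extra log(K - 1) term."""
    return math.log((1.0 - err) / err) + math.log(n_classes - 1)

# With K = 2 the extra term vanishes and this is exactly AdaBoost's weight.
# With K = 10, a learner with 40% accuracy (err = 0.6) is worse than a coin
# flip but much better than random guessing (10%), so it still earns a
# positive weight, whereas two-class AdaBoost would reject it.
```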

SMOTEBoost
 Referenced in 35 articles
[sw12571]
 SMOTE and the standard boosting procedure AdaBoost to better model the minority class by providing...
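
SMOTE, the oversampling half of SMOTEBoost, creates synthetic minority-class examples by interpolating between a minority point and one of its nearest minority neighbours; boosting then runs on the augmented sample. An illustrative sketch with a naive neighbour search — not the SMOTEBoost implementation:

```python
import random

def smote(minority, n_new, k=2, seed=0):
    """SMOTE-style oversampling sketch: each synthetic point lies on
    the segment between a minority example and one of its k nearest
    minority neighbours (Euclidean distance, brute force)."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a = rng.choice(minority)
        neighbours = sorted(
            (p for p in minority if p is not a),
            key=lambda p: sum((ai - pi) ** 2 for ai, pi in zip(a, p)),
        )[:k]
        b = rng.choice(neighbours)
        t = rng.random()  # random position along the segment a -> b
        synthetic.append(tuple(ai + t * (bi - ai) for ai, bi in zip(a, b)))
    return synthetic
```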

LogitBoost
 Referenced in 18 articles
[sw08543]
 that the boosting-like algorithms, such as AdaBoost and many of its modifications, may over...

AdaBoost.RT
 Referenced in 14 articles
[sw08520]
 preset threshold value, and then following the AdaBoost procedure. Thus, it requires selecting the suboptimal...
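
The thresholding step described above is what adapts AdaBoost to regression: a prediction counts as an error when its absolute relative error exceeds the preset threshold φ, and the resulting weighted error rate feeds the usual weight-update machinery. A sketch of that step (assuming nonzero targets; names are illustrative):

```python
def rt_error_rate(weights, y_true, y_pred, phi):
    """AdaBoost.RT-style weighted error rate (sketch): sum the weights
    of examples whose absolute relative error exceeds phi. Assumes
    y_true contains no zeros."""
    err = 0.0
    for w, y, p in zip(weights, y_true, y_pred):
        if abs((p - y) / y) > phi:
            err += w
    return err
```

The sensitivity of the method to the choice of φ is exactly the "selecting the suboptimal threshold" issue the snippet alludes to.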

adabag
 Referenced in 7 articles
[sw08024]
 adabag: Applies multiclass AdaBoost.M1, AdaBoost-SAMME and Bagging. This package implements Freund and Schapire ... features were introduced in version 3.0, AdaBoost-SAMME (Zhu et al., 2009) is implemented...

RBoost
 Referenced in 4 articles
[sw29975]
 Function and the Numerically Stable Base Learners. AdaBoost has attracted much attention in the machine ... combining weak classifiers into strong classifiers. However, AdaBoost tends to overfit to the noisy data ... applications. Accordingly, improving the anti-noise ability of AdaBoost plays an important role in many applications ... the sensitivity of AdaBoost to noisy data stems from the exponential loss function, which puts...

gBoost
 Referenced in 8 articles
[sw42199]
 that progressively collects informative patterns. Compared to AdaBoost, gBoost can build the prediction rule with...

SABoost
 Referenced in 6 articles
[sw36832]
 smooth (early) stopping rule. The performance of AdaBoost is compared and contrasted...

PromoterExplorer
 Referenced in 2 articles
[sw35524]
 effective promoter identification method based on the AdaBoost algorithm. Motivation: Promoter prediction is important ... high-dimensional input vector. A cascade AdaBoost-based learning procedure is adopted to select...

JOUSBoost
 Referenced in 3 articles
[sw33874]
 used with machine learning methods such as AdaBoost, random forests...

GA-Ensemble
 Referenced in 1 article
[sw11385]
 difficult to interpret. Some boosting methods, including AdaBoost, are also very sensitive to outliers ... test set error rates of GA-Ensemble, AdaBoost, and GentleBoost (an outlier-resistant version ... AdaBoost) using several artificial data sets and real-world data sets from the UC Irvine ... results in simpler predictive models than AdaBoost and GentleBoost...

fastAdaboost
 Referenced in 1 article
[sw33873]
 package fastAdaboost: a Fast Implementation of Adaboost. Implements Adaboost based on C++ backend code. This ... implements the Adaboost.M1 algorithm and the real Adaboost(SAMME.R) algorithm...

StatPatternRecognition
 Referenced in 1 article
[sw15007]
 analysis, decision trees, bump hunting (PRIM), boosting (AdaBoost), bagging and random forest algorithms, and interfaces...

deFuse
 Referenced in 1 article
[sw38447]
 have trained an adaboost classifier on 11 novel features of the sequence data. The resulting...

mAHTPred
 Referenced in 1 article
[sw42978]
 utilized six different ML algorithms, namely, Adaboost, extremely randomized tree (ERT), gradient boosting...

PyFeat
 Referenced in 0 articles
[sw36968]
 provide more local information. We then employ AdaBoost technique to select features with maximum discriminatory...

Metai6mA
 Referenced in 0 articles
[sw37446]
 randomized tree, logistic regression, naïve Bayes and AdaBoost). The Rosaceae genome was employed to train...