• gbm

  • Referenced in 60 articles [sw07994]
  • implements extensions to Freund and Schapire’s AdaBoost algorithm and Friedman’s gradient boosting machine ... logistic, Poisson, Cox proportional hazards partial likelihood, AdaBoost exponential loss, Huberized hinge loss, and Learning...
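
The gbm entry describes additive models fit by Friedman's gradient boosting. As a hedged illustration of the core idea only (not the gbm package's own code), the Python sketch below fits small scikit-learn regression trees to residuals under squared-error loss; the function names and parameters are assumptions made for the example.

```python
# Illustrative gradient-boosting sketch (squared-error loss), assuming
# scikit-learn regression trees as base learners; not the gbm package itself.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost(X, y, n_rounds=100, learning_rate=0.1, max_depth=2):
    """Build F(x) = F0 + lr * sum_m h_m(x) by fitting each tree h_m to the
    negative gradient of the loss, which for squared error is the residual."""
    F0 = float(np.mean(y))
    F = np.full(len(y), F0)
    trees = []
    for _ in range(n_rounds):
        residuals = y - F                                  # negative gradient of 0.5*(y - F)^2
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residuals)
        F = F + learning_rate * tree.predict(X)
        trees.append(tree)
    return F0, trees

def gb_predict(model, X, learning_rate=0.1):
    F0, trees = model
    return F0 + learning_rate * sum(t.predict(X) for t in trees)
```
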
  • AdaCost

  • Referenced in 30 articles [sw33192]
  • cost-sensitive boosting. AdaCost, a variant of AdaBoost, is a misclassification cost-sensitive boosting method ... reduce the cumulative misclassification cost more than AdaBoost. We formally show that AdaCost reduces ... reduction in the cumulative misclassification cost over AdaBoost without consuming additional computing power...
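
A minimal sketch of the cost-sensitive weight update used by AdaCost-style boosting, assuming scikit-learn stumps as weak learners and labels in {-1, +1}. The particular cost-adjustment functions beta_correct/beta_wrong below are illustrative choices, not necessarily those of the AdaCost paper.

```python
# Schematic cost-sensitive boosting in the spirit of AdaCost, with y in {-1, +1}
# and cost[i] in [0, 1]; the beta functions are illustrative assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adacost(X, y, cost, n_rounds=50):
    n = len(y)
    w = np.ones(n) / n
    learners, alphas = [], []
    beta_correct = lambda c: 0.5 * (1.0 - c)   # costly examples lose less weight when correct
    beta_wrong   = lambda c: 0.5 * (1.0 + c)   # costly examples gain more weight when wrong
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        margin = y * stump.predict(X)          # +1 correct, -1 wrong
        beta = np.where(margin > 0, beta_correct(cost), beta_wrong(cost))
        r = float(np.sum(w * margin * beta))
        alpha = 0.5 * np.log((1 + r) / (1 - r))
        w = w * np.exp(-alpha * margin * beta) # cost-adjusted exponential reweighting
        w = w / w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas
```
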
  • AdaBoost-SAMME

  • Referenced in 27 articles [sw19134]
  • Multi-class AdaBoost. Boosting has been a very successful technique for solving the two-class ... algorithm that directly extends the AdaBoost algorithm to the multi-class case without reducing ... show that the proposed multi-class AdaBoost algorithm is equivalent to a forward stagewise additive...
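
A compact sketch of the SAMME-style multi-class extension: relative to binary AdaBoost, the learner weight gains an extra log(K-1) term and the weak learner only has to beat multi-class random guessing. Function names and the stump base learner are assumptions made for this example.

```python
# Minimal SAMME-style multi-class AdaBoost sketch with decision stumps.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def samme(X, y, n_rounds=50):
    classes = np.unique(y)
    K, n = len(classes), len(y)
    w = np.ones(n) / n
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        miss = stump.predict(X) != y
        err = float(np.dot(w, miss))
        if err >= 1.0 - 1.0 / K:                # must beat multi-class random guessing
            break
        alpha = np.log((1 - err) / max(err, 1e-12)) + np.log(K - 1)  # extra log(K-1) term
        w = w * np.exp(alpha * miss)
        w = w / w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return classes, learners, alphas

def samme_predict(model, X):
    classes, learners, alphas = model
    votes = sum(a * (l.predict(X)[:, None] == classes) for l, a in zip(learners, alphas))
    return classes[np.argmax(votes, axis=1)]
```
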
  • SMOTEBoost

  • Referenced in 35 articles [sw12571]
  • SMOTE and the standard boosting procedure AdaBoost to better model the minority class by providing...
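
A rough sketch of the SMOTEBoost idea under simplifying assumptions: each boosting round oversamples the minority class with SMOTE (here via the imbalanced-learn package) before fitting the weak learner, then updates the boosting weights on the original data. The published algorithm couples SMOTE and the weight distribution more carefully.

```python
# Rough SMOTEBoost-style sketch; y in {-1, +1} with +1 the minority class.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from imblearn.over_sampling import SMOTE

def smoteboost(X, y, n_rounds=20, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.ones(n) / n
    learners, alphas = [], []
    for _ in range(n_rounds):
        # Draw a weighted bootstrap sample, then add synthetic minority examples.
        # (Assumes the sample keeps enough minority points for SMOTE's neighbours.)
        idx = rng.choice(n, size=n, replace=True, p=w)
        Xs, ys = SMOTE().fit_resample(X[idx], y[idx])
        stump = DecisionTreeClassifier(max_depth=1).fit(Xs, ys)
        pred = stump.predict(X)
        err = float(np.dot(w, pred != y))
        if err <= 0 or err >= 0.5:
            break
        alpha = 0.5 * np.log((1 - err) / err)
        w = w * np.exp(-alpha * y * pred)       # usual AdaBoost update on the original data
        w = w / w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas
```
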
  • LogitBoost

  • Referenced in 18 articles [sw08543]
  • that the boosting-like algorithms, such as AdaBoost and many of its modifications, may over...
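
For reference, a two-class LogitBoost sketch in the spirit of Friedman, Hastie and Tibshirani: each round performs a weighted least-squares fit of a small regression tree to the Newton working response for the binomial log-likelihood. This is an illustrative reconstruction with assumed names, not the LogitBoost package itself.

```python
# Two-class LogitBoost sketch (Newton steps on the binomial log-likelihood),
# with y in {0, 1}.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def logitboost(X, y, n_rounds=50):
    n = len(y)
    F = np.zeros(n)                                   # half log-odds scale
    trees = []
    for _ in range(n_rounds):
        p = 1.0 / (1.0 + np.exp(-2.0 * F))
        wt = np.clip(p * (1.0 - p), 1e-6, None)       # Newton weights
        z = (y - p) / wt                              # working response
        tree = DecisionTreeRegressor(max_depth=2).fit(X, z, sample_weight=wt)
        F = F + 0.5 * tree.predict(X)
        trees.append(tree)
    return trees
```
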
  • AdaBoost.RT

  • Referenced in 14 articles [sw08520]
  • preset threshold value, and then following the AdaBoost procedure. Thus, it requires selecting the suboptimal...
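
A simplified sketch of the AdaBoost.RT mechanism described above: a regression prediction counts as an error when its relative error exceeds the preset threshold phi, after which an AdaBoost-style reweighting is applied. The exact beta schedule and final combination rule of the published algorithm are simplified here, and the helper names are assumptions.

```python
# Simplified AdaBoost.RT-style sketch for boosted regression with a threshold.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def adaboost_rt(X, y, phi=0.1, n_rounds=30):
    n = len(y)
    w = np.ones(n) / n
    learners, log_inv_betas = [], []
    for _ in range(n_rounds):
        learner = DecisionTreeRegressor(max_depth=3).fit(X, y, sample_weight=w)
        rel_err = np.abs((learner.predict(X) - y) / y)   # assumes y != 0
        wrong = rel_err > phi                            # threshold turns regression into hit/miss
        eps = float(np.dot(w, wrong))
        if eps <= 0 or eps >= 0.5:
            break
        beta = eps / (1 - eps)
        w = np.where(wrong, w, w * beta)                 # shrink weights of well-predicted points
        w = w / w.sum()
        learners.append(learner)
        log_inv_betas.append(np.log(1.0 / beta))
    return learners, log_inv_betas

def rt_predict(model, X):
    learners, log_inv_betas = model
    total = sum(log_inv_betas)
    return sum(c * l.predict(X) for l, c in zip(learners, log_inv_betas)) / total
```
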
  • adabag

  • Referenced in 7 articles [sw08024]
• adabag: Applies multiclass AdaBoost.M1, AdaBoost-SAMME and Bagging. This package implements Freund and Schapire ... features were introduced in version 3.0, AdaBoost-SAMME (Zhu et al., 2009) is implemented...
  • RBoost

  • Referenced in 4 articles [sw29975]
• Function and the Numerically Stable Base Learners. AdaBoost has attracted much attention in the machine ... combining weak classifiers into strong classifiers. However, AdaBoost tends to overfit to noisy data ... applications. Accordingly, improving the noise robustness of AdaBoost plays an important role in many applications ... the sensitivity of AdaBoost to noisy data stems from the exponential loss function, which puts...
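
A small numeric illustration of the point about the exponential loss: its penalty grows without bound as the margin becomes more negative, so a few noisy or mislabeled points can dominate the reweighting. The logistic loss below is used only as a generic slower-growing stand-in, not as RBoost's actual loss.

```python
# Why the exponential loss is noise-sensitive: compare losses at several margins.
import numpy as np

margins = np.array([2.0, 0.0, -2.0, -6.0])       # y * F(x) for four examples
exp_loss = np.exp(-margins)                      # AdaBoost's exponential loss
logistic_loss = np.log1p(np.exp(-margins))       # grows only linearly in -margin

for m, e, l in zip(margins, exp_loss, logistic_loss):
    print(f"margin {m:+.1f}:  exponential {e:8.2f}   logistic {l:6.2f}")
# The badly misclassified point at margin -6 gets weight ~403 under the
# exponential loss but only ~6 under the logistic loss.
```
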
  • gBoost

  • Referenced in 8 articles [sw42199]
  • that progressively collects informative patterns. Compared to AdaBoost, gBoost can build the prediction rule with...
  • SABoost

  • Referenced in 6 articles [sw36832]
  • smooth (early) stopping rule. The performance of AdaBoost is compared and contrasted...
  • PromoterExplorer

  • Referenced in 2 articles [sw35524]
  • effective promoter identification method based on the AdaBoost algorithm. Motivation: Promoter prediction is important ... high-dimensional input vector. A cascade AdaBoost-based learning procedure is adopted to select...
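
A generic cascade-of-AdaBoost sketch in the spirit of the cascade learning procedure mentioned above, assuming scikit-learn's AdaBoostClassifier as the per-stage learner and labels in {0, 1}; PromoterExplorer's feature extraction and stage thresholds are not reproduced.

```python
# Each stage is trained on the candidates earlier stages did not reject, and a
# candidate must survive every stage to be accepted. Assumes both classes
# remain present in the surviving training set at each stage.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def train_cascade(X, y, n_stages=3, stage_estimators=50):
    stages, keep = [], np.ones(len(y), dtype=bool)
    for _ in range(n_stages):
        clf = AdaBoostClassifier(n_estimators=stage_estimators).fit(X[keep], y[keep])
        stages.append(clf)
        keep &= (clf.predict(X) == 1)            # pass only predicted positives onward
        if keep.sum() == 0:
            break
    return stages

def cascade_predict(stages, X):
    verdict = np.ones(len(X), dtype=bool)
    for clf in stages:
        verdict &= (clf.predict(X) == 1)
    return verdict.astype(int)
```
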
  • JOUSBoost

  • Referenced in 3 articles [sw33874]
  • used with machine learning methods such as AdaBoost, random forests...
  • GA-Ensemble

  • Referenced in 1 article [sw11385]
  • difficult to interpret. Some boosting methods, including AdaBoost, are also very sensitive to outliers ... test set error rates of GA-Ensemble, AdaBoost, and GentleBoost (an outlier-resistant version ... AdaBoost) using several artificial data sets and real-world data sets from the UC-Irvine ... results in simpler predictive models than AdaBoost and GentleBoost...
  • fastAdaboost

  • Referenced in 1 article [sw33873]
• package fastAdaboost: a Fast Implementation of Adaboost. Implements Adaboost based on C++ backend code. This ... implements the Adaboost.M1 algorithm and the real Adaboost (SAMME.R) algorithm...
  • gmb

  • Referenced in 1 article [sw14565]
  • extensions to Freund and Schapire’s AdaBoost algorithm and Friedman’s gradient boosting machine. Includes ... logistic, Poisson, Cox proportional hazards partial likelihood, AdaBoost exponential loss, Huberized hinge loss, and Learning...
  • StatPatternRecognition

  • Referenced in 1 article [sw15007]
  • analysis, decision trees, bump hunting (PRIM), boosting (AdaBoost), bagging and random forest algorithms, and interfaces...
  • deFuse

  • Referenced in 1 article [sw38447]
  • have trained an adaboost classifier on 11 novel features of the sequence data. The resulting...
  • mAHTPred

  • Referenced in 1 article [sw42978]
  • utilized six different ML algorithms, namely, Adaboost, extremely randomized tree (ERT), gradient boosting...
  • PyFeat

  • Referenced in 0 articles [sw36968]
  • provide more local information. We then employ AdaBoost technique to select features with maximum discriminatory...
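
A hedged sketch of boosting-driven feature selection along the lines described above: fit scikit-learn's AdaBoostClassifier and keep the columns with the highest impurity-based importances. The function name and cut-off are assumptions, not PyFeat's exact recipe.

```python
# Rank features by AdaBoost importance and keep the top n_keep column indices.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def select_features(X, y, n_keep=50):
    clf = AdaBoostClassifier(n_estimators=200).fit(X, y)
    ranked = np.argsort(clf.feature_importances_)[::-1]   # most discriminative first
    return ranked[:n_keep]                                  # indices of retained features
```
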
  • Meta-i6mA

  • Referenced in 0 articles [sw37446]
  • randomized tree, logistic regression, naïve Bayes and AdaBoost). The Rosaceae genome was employed to train...