Multi-class AdaBoost. Boosting has been a very successful technique for solving the two-class classification problem. In going from two-class to multi-class classification, most algorithms have been restricted to reducing the multi-class problem to multiple two-class problems. We develop a new algorithm that directly extends the AdaBoost algorithm to the multi-class case without such a reduction. We show that the proposed multi-class AdaBoost algorithm is equivalent to a forward stagewise additive modeling algorithm that minimizes a novel exponential loss for multi-class classification. Furthermore, we show that this exponential loss belongs to a class of Fisher-consistent loss functions for multi-class classification. The new algorithm is extremely easy to implement and is highly competitive in terms of misclassification error rate.
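The abstract does not reproduce the algorithm itself, but the direct multi-class extension it describes is commonly known as SAMME: the usual AdaBoost reweighting is kept, while the stage weight log((1 - err)/err) gains an extra log(K - 1) term, so a weak learner only needs to beat random guessing among K classes (accuracy 1/K) rather than 1/2. The following is a minimal illustrative sketch under that assumption, using weighted decision stumps as weak learners; the function names (`fit_stump`, `samme_fit`, `samme_predict`) and the toy data are hypothetical, not taken from the paper.

```python
import numpy as np

def fit_stump(X, y, w, K):
    """Weighted decision stump: threshold one feature, each side
    predicting its weighted-majority class."""
    best_err, best = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            pred = np.empty(len(y), dtype=int)
            for side in (left, ~left):
                if side.any():  # weighted majority vote on this side
                    pred[side] = np.bincount(
                        y[side], weights=w[side], minlength=K).argmax()
            err = w[pred != y].sum()
            if err < best_err:
                left_cls = pred[left][0] if left.any() else 0
                right_cls = pred[~left][0] if (~left).any() else 0
                best_err, best = err, (j, t, left_cls, right_cls)
    return best, best_err

def stump_predict(stump, X):
    j, t, left_cls, right_cls = stump
    return np.where(X[:, j] <= t, left_cls, right_cls)

def samme_fit(X, y, K, n_rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform initial sample weights
    ensemble = []
    for _ in range(n_rounds):
        stump, err = fit_stump(X, y, w, K)
        if err >= 1.0 - 1.0 / K:     # no better than random K-class guessing
            break
        err = max(err, 1e-10)
        # SAMME stage weight: AdaBoost's term plus log(K - 1)
        alpha = np.log((1.0 - err) / err) + np.log(K - 1.0)
        pred = stump_predict(stump, X)
        w = w * np.exp(alpha * (pred != y))  # upweight mistakes
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def samme_predict(ensemble, X, K):
    votes = np.zeros((len(X), K))
    for alpha, stump in ensemble:    # weighted vote over weak learners
        votes[np.arange(len(X)), stump_predict(stump, X)] += alpha
    return votes.argmax(axis=1)

# toy 3-class, 1-D data; three stumps suffice to separate it
X = np.array([[0.], [1.], [2.], [10.], [11.], [12.], [20.], [21.], [22.]])
y = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])
ens = samme_fit(X, y, K=3, n_rounds=3)
print((samme_predict(ens, X, K=3) == y).mean())  # expect 1.0 on this toy set
```

Note the stopping rule: with K classes the "better than chance" bar drops from error 1/2 to error 1 - 1/K, which is exactly what the extra log(K - 1) term in the stage weight accounts for.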

References in zbMATH (referenced in 28 articles)

Showing results 1 to 20 of 28, sorted by year (citations).


  1. Carrizosa, Emilio; Molero-Río, Cristina; Romero Morales, Dolores: Mathematical optimization in classification and regression trees (2021)
  2. Feng, Chen; Griffin, Paul; Kethireddy, Shravan; Mei, Yajun: A boosting inspired personalized threshold method for sepsis screening (2021)
  3. Su, Jinsong; Tang, Jialong; Jiang, Hui; Lu, Ziyao; Ge, Yubin; Song, Linfeng; Xiong, Deyi; Sun, Le; Luo, Jiebo: Enhanced aspect-based sentiment analysis models with progressive self-supervised attention learning (2021)
  4. Xie, Yunxin; Zhu, Chenyang; Hu, Runshan; Zhu, Zhengwei: A coarse-to-fine approach for intelligent logging lithology identification with extremely randomized trees (2021)
  5. Yang, Yi; Guo, Yuxuan; Chang, Xiangyu: Angle-based cost-sensitive multicategory classification (2021)
  6. Ye, Qing Chuan; Rhuggenaath, Jason; Zhang, Yingqian; Verwer, Sicco; Hilgeman, Michiel Jurgen: Data driven design for online industrial auctions (2021)
  7. Bauvin, Baptiste; Capponi, Cécile; Roy, Jean-Francis; Laviolette, François: Fast greedy \(\mathcal{C}\)-bound minimization with guarantees (2020)
  8. Han, Sunwoo; Kim, Hyunjoong; Lee, Yung-Seop: Double random forest (2020)
  9. Aravkin, Aleksandr Y.; Bottegal, Giulio; Pillonetto, Gianluigi: Boosting as a kernel-based method (2019)
  10. Blachnik, Marcin: Ensembles of instance selection methods: a comparative study (2019)
  11. Drotár, Peter; Gazda, Matej; Vokorokos, Liberios: Ensemble feature selection using election methods and ranker clustering (2019)
  12. Tran, Anh; Sun, Jing; Furlan, John M.; Pagalthivarthi, Krishnan V.; Visintainer, Robert J.; Wang, Yan: pBO-2GP-3B: a batch parallel known/unknown constrained Bayesian optimization with feasibility classification and its applications in computational fluid dynamics (2019)
  13. Wang, Xin; Zhang, Hao Helen; Wu, Yichao: Multiclass probability estimation with support vector machines (2019)
  14. Al-Rawabdeh, Wasfi A.; Dalalah, Doraid: Predictive decision making under risk and uncertainty: a support vector machines model (2017)
  15. Li, Zhen; Wu, Wei: Reversible data hiding for encrypted images based on statistical learning (2016)
  16. Neykov, Matey; Liu, Jun S.; Cai, Tianxi: On the characterization of a class of Fisher-consistent loss functions and its application to boosting (2016)
  17. Geist, Matthieu: Soft-max boosting (2015)
  18. Chen, Yu-Chuan; Ha, Hyejung; Kim, Hyunjoong; Ahn, Hongshik: Canonical forest (2014)
  19. Fernández-Baldera, Antonio; Baumela, Luis: Multi-class boosting with asymmetric binary weak-learners (2014)
  20. Nie, Qingfeng; Jin, Lizuo; Fei, Shumin: Probability estimation for multi-class classification using AdaBoost (2014)
