AdaBoost.MH

A decision-theoretic generalization of on-line learning and an application to boosting.

In the first part of the paper we consider the problem of dynamically apportioning resources among a set of options in a worst-case on-line framework. The model we study can be interpreted as a broad, abstract extension of the well-studied on-line prediction model to a general decision-theoretic setting. We show that the multiplicative weight-update Littlestone-Warmuth rule can be adapted to this model, yielding bounds that are slightly weaker in some cases, but applicable to a considerably more general class of learning problems. We show how the resulting learning algorithm can be applied to a variety of problems, including gambling, multiple-outcome prediction, repeated games, and prediction of points in $\mathbf{R}^n$. In the second part of the paper we apply the multiplicative weight-update technique to derive a new boosting algorithm. This boosting algorithm does not require any prior knowledge about the performance of the weak learning algorithm. We also study generalizations of the new boosting algorithm to the problem of learning functions whose range, rather than being binary, is an arbitrary finite set or a bounded segment of the real line.
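
The allocation scheme in the first part is a multiplicative-weights rule: keep one weight per option, play the normalized weights as a probability distribution each trial, and shrink every weight in proportion to the loss its option just suffered. A minimal Python sketch of this update (the paper calls the algorithm Hedge($\beta$); the function name and array interface here are illustrative assumptions):

```python
import numpy as np

def hedge(losses, beta=0.9):
    """Multiplicative weight-update allocation, in the spirit of Hedge(beta).

    losses: (T, N) array with entries in [0, 1]; losses[t, i] is the loss
    of option i at trial t.  beta in (0, 1) controls how aggressively
    weights decay.  Illustrative sketch, not the paper's pseudocode.
    """
    T, N = losses.shape
    w = np.ones(N)                    # start with uniform weights
    total = 0.0
    for t in range(T):
        p = w / w.sum()               # distribution over the N options
        total += p @ losses[t]        # learner's expected loss this trial
        w *= beta ** losses[t]        # multiplicative weight update
    return w / w.sum(), total
```

The worst-case guarantee is that `total` is not much larger than the cumulative loss of the single best option in hindsight, for any bounded loss sequence.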

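For the generalization to an arbitrary finite label set, the variant this entry is named after, AdaBoost.MH, reduces a $k$-class (possibly multi-label) problem to binary questions over (example, label) pairs and runs the same multiplicative reweighting on that expanded distribution. A hedged sketch, assuming a caller-supplied `weak_learner(X, Y, D)` that returns a hypothesis `h` with `h(X)` of shape `(n, k)` and values in $\{-1, +1\}$ (this interface is an assumption for illustration):

```python
import numpy as np

def adaboost_mh(X, Y, weak_learner, T=50):
    """AdaBoost.MH sketch.

    X: (n, d) inputs.  Y: (n, k) sign matrix, Y[i, l] = +1 if label l
    applies to example i and -1 otherwise.  weak_learner(X, Y, D) is a
    placeholder for any weak learning routine over the pair distribution D.
    """
    n, k = Y.shape
    D = np.full((n, k), 1.0 / (n * k))    # distribution over (example, label) pairs
    ensemble = []
    for _ in range(T):
        h = weak_learner(X, Y, D)
        pred = h(X)                        # (n, k) predictions in {-1, +1}
        err = np.clip(np.sum(D * (pred != Y)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        D *= np.exp(-alpha * Y * pred)     # downweight correctly answered pairs
        D /= D.sum()
        ensemble.append((alpha, h))

    def predict(X_new):
        # single-label decoding: pick the label with the largest weighted vote
        F = sum(a * h(X_new) for a, h in ensemble)
        return np.argmax(F, axis=1)

    return predict
```

Any weak learner that beats 50% weighted accuracy on the pair distribution suffices; decision stumps over (feature, label) splits are a common choice.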

References in zbMATH (referenced in 430 articles, 1 standard article)

Showing results 1 to 20 of 430, sorted by year (citations).


  1. Connamacher, Harold; Pancha, Nikil; Liu, Rui; Ray, Soumya: RankBoost+: an improvement to RankBoost (2020)
  2. Fujita, Takahiro; Hatano, Kohei; Takimoto, Eiji: Boosting over non-deterministic ZDDs (2020)
  3. Lai, Yuanhao; McLeod, Ian: Ensemble quantile classifier (2020)
  4. Lopes, Miles E.: Estimating a sharp convergence bound for randomized ensembles (2020)
  5. van Engelen, Jesper E.; Hoos, Holger H.: A survey on semi-supervised learning (2020)
  6. Zhu, Qiwu; Xiong, Qingyu; Wang, Kai; Lu, Wang; Liu, Tong: Accurate WiFi-based indoor localization by using fuzzy classifier and MLPs ensemble in complex environment (2020)
  7. Aravkin, Aleksandr Y.; Bottegal, Giulio; Pillonetto, Gianluigi: Boosting as a kernel-based method (2019)
  8. Baumann, P.; Hochbaum, D. S.; Yang, Y. T.: A comparative study of the leading machine learning techniques and two new optimization algorithms (2019)
  9. Biau, G.; Cadre, B.; Rouvière, L.: Accelerated gradient boosting (2019)
  10. Blachnik, Marcin: Ensembles of instance selection methods: a comparative study (2019)
  11. Bourel, Mathias; Cugliari, Jairo: Bagging of density estimators (2019)
  12. Cai, Guoqiang; Yang, Chen; Pan, Yue; Lv, Jiaojiao: EMD and GNN-AdaBoost fault diagnosis for urban rail train rolling bearings (2019)
  13. Conaty, Diarmaid; Martínez del Rincón, Jesús; de Campos, Cassio P.: A hierarchy of sum-product networks using robustness (2019)
  14. Gosztolya, Gábor; Busa-Fekete, Róbert: Calibrating AdaBoost for phoneme classification (2019)
  15. Hanneke, Steve; Yang, Liu: Surrogate losses in passive and active learning (2019)
  16. Han, Xu: Nonparametric screening under conditional strictly convex loss for ultrahigh dimensional sparse data (2019)
  17. Lopes, Miles E.: Estimating the algorithmic variance of randomized ensembles via the bootstrap (2019)
  18. Luo, Cheng; Zhang, Bo; Xiang, Yang; Qi, Man: Gaussian-Gamma collaborative filtering: a hierarchical Bayesian model for recommender systems (2019)
  19. Pham, Tuan M.; Doan, Danh C.; Hitzer, Eckhard: Feature extraction using conformal geometric algebra for AdaBoost algorithm based in-plane rotated face detection (2019)
  20. Vandoni, Jennifer; Aldea, Emanuel; Le Hégarat-Mascle, Sylvie: Evidential query-by-committee active learning for pedestrian detection in high-density crowds (2019)
