AdaBoost.RT

Experiments with AdaBoost.RT, an improved boosting scheme for regression. The application of boosting techniques to regression problems has received relatively little attention compared with research on classification problems. This letter describes a new boosting algorithm, AdaBoost.RT, for regression problems. The idea is to filter out the examples whose relative estimation error is higher than a preset threshold value and then to follow the AdaBoost procedure. Thus, it requires selecting a suboptimal value of the error threshold to demarcate examples as poorly or well predicted. Experimental results using the M5 model tree as the weak learning machine on several benchmark data sets are reported. The results are compared to other boosting methods, bagging, artificial neural networks, and a single M5 model tree. These preliminary empirical comparisons show higher performance of AdaBoost.RT on most of the considered data sets.
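The procedure described above is compact enough to state in a few lines of code. The following is a minimal illustrative sketch, not the authors' implementation: it assumes a scikit-learn regression tree as a stand-in for the M5 model tree weak learner, and the threshold phi, the power coefficient n, and all function and variable names are our own choices for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def adaboost_rt(X, y, T=10, phi=0.05, n=2):
    """Sketch of AdaBoost.RT: examples whose absolute relative error
    (ARE) exceeds the threshold phi count as 'poorly predicted'."""
    m = len(y)
    D = np.full(m, 1.0 / m)              # uniform initial weight distribution
    models, betas = [], []
    for _ in range(T):
        tree = DecisionTreeRegressor(max_depth=4)
        tree.fit(X, y, sample_weight=D)  # train weak learner on distribution D
        are = np.abs((tree.predict(X) - y) / y)  # assumes y != 0 everywhere
        miss = are > phi                 # poorly predicted examples
        eps = D[miss].sum()              # weighted error rate of this round
        if eps <= 0.0 or eps >= 1.0:     # degenerate round: stop boosting
            break
        beta = eps ** n                  # power coefficient n (e.g., 1, 2, or 3)
        D = np.where(miss, D, D * beta)  # shrink weights of well-predicted examples
        D /= D.sum()                     # renormalize to a distribution
        models.append(tree)
        betas.append(beta)
    return models, np.log(1.0 / np.array(betas))

def adaboost_rt_predict(models, weights, X):
    """Combine the weak hypotheses, weighted by log(1/beta_t)."""
    preds = np.array([mdl.predict(X) for mdl in models])
    return weights @ preds / weights.sum()
```

Usage would look like `models, w = adaboost_rt(X_train, y_train, T=20, phi=0.05)` followed by `y_hat = adaboost_rt_predict(models, w, X_test)`. Note that the relative-error criterion implicitly assumes targets bounded away from zero, and the suitable value of phi is data-dependent, which is the threshold-selection issue the abstract mentions.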


References in zbMATH (referenced in 12 articles)


  1. Zhang, Chun-Xia; Zhang, Jiang-She; Kim, Sang-Woon: PBoostGA: pseudo-boosting genetic algorithm for variable ranking and selection (2016)
  2. Jackowski, Konrad: Adaptive splitting and selection algorithm for regression (2015)
  3. Alhamdoosh, Monther; Wang, Dianhui: Fast decorrelated neural network ensembles with random weights (2014)
  4. Kim, Hyun Hak; Swanson, Norman R.: Forecasting financial and macroeconomic variables using data reduction methods: new empirical evidence (2014)
  5. Pardo, Carlos; Diez-Pastor, José F.; García-Osorio, César; Rodríguez, Juan J.: Rotation Forests for regression (2013)
  6. Techo, Jakkrit; Nattee, Cholwich; Theeramunkong, Thanaruk: Boosting-based ensemble learning with penalty profiles for automatic Thai unknown word recognition (2012)
  7. Bailly, Kevin; Milgram, Maurice: Boosting feature selection for neural network based regression (2009)
  8. Zhang, Chun-Xia; Zhang, Jiang-She; Zhang, Gai-Ying: Using boosting to prune double-bagging ensembles (2009)
  9. Zhang, Chun-Xia; Zhang, Jiang-She; Wang, Guan-Wei: An empirical study of using Rotation Forest to improve regressors (2008)
  10. Shrestha, D. L.; Solomatine, D. P.: Experiments with AdaBoost.RT, an improved boosting scheme for regression (2006)
  11. Shrestha, Durga L.; Solomatine, Dimitri P.: Machine learning approaches for estimation of prediction interval for the model output (2006)
  12. Solomatine, D. P.; Siek, M. B.: Modular learning models in forecasting natural phenomena (2006)