The application of boosting techniques to regression problems has received relatively little attention compared with research on classification problems. This letter describes a new boosting algorithm, AdaBoost.RT, for regression problems. Its idea is to filter out the examples whose relative estimation error is higher than a preset threshold value, and then to follow the AdaBoost procedure. Thus, it requires selecting a suboptimal value of the error threshold to demarcate examples as poorly or well predicted. Experimental results using the M5 model tree as a weak learning machine on several benchmark data sets are reported. The results are compared with other boosting methods, bagging, artificial neural networks, and a single M5 model tree. These preliminary empirical comparisons show higher performance of AdaBoost.RT for most of the considered data sets.
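The scheme described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes a toy depth-1 regression stump as the weak learner (the letter uses M5 model trees), nonzero targets so that relative error is defined, and the common AdaBoost.RT choices of weighting well-predicted examples down by beta = epsilon^n and combining machines by log(1/beta). The names `phi`, `T`, and `n` (threshold, number of rounds, power coefficient) are illustrative.

```python
import numpy as np

def fit_stump(x, y, w):
    """Toy weak learner: weighted depth-1 regression stump on a single feature."""
    best = None
    for s in np.unique(x)[:-1]:
        left = x <= s
        yl = np.average(y[left], weights=w[left])    # weighted mean on each side
        yr = np.average(y[~left], weights=w[~left])
        err = np.sum(w * (y - np.where(left, yl, yr)) ** 2)
        if best is None or err < best[0]:
            best = (err, s, yl, yr)
    _, s, yl, yr = best
    return lambda q: np.where(q <= s, yl, yr)

def adaboost_rt(x, y, phi=0.1, T=10, n=1):
    """Sketch of AdaBoost.RT-style boosting; assumes nonzero targets y."""
    w = np.full(len(x), 1.0 / len(x))         # uniform initial weights
    models, betas = [], []
    for _ in range(T):
        f = fit_stump(x, y, w)
        are = np.abs((f(x) - y) / y)          # absolute relative estimation error
        poor = are > phi                      # error exceeds the preset threshold
        eps = max(w[poor].sum(), 1e-12)       # weighted error rate of this machine
        beta = eps ** n                       # n is the power coefficient
        w = np.where(poor, w, w * beta)       # shrink weights of well-predicted examples
        w /= w.sum()                          # renormalize to a distribution
        models.append(f)
        betas.append(beta)
    logb = np.log(1.0 / np.array(betas))      # machine weights for the final combination
    def predict(q):
        preds = np.array([f(q) for f in models])
        return (logb[:, None] * preds).sum(axis=0) / logb.sum()
    return predict
```

Because the error rate counts only the weight of examples whose relative error exceeds the threshold, the choice of `phi` directly controls how aggressively later machines focus on poorly predicted examples, which is why the letter stresses selecting that threshold carefully.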