Machine Learning
An AdaBoost-like algorithm for boosting CART regression trees is considered. The sequence of boosted predictors is analysed on various data sets and the behaviour of the algorithm is investigated. An instability index of a given estimation method with respect to a training sample is defined. Building on the bagging algorithm, this instability index is then extended to quantify the additional instability introduced by boosting relative to bagging. Finally, boosting's ability to track outliers and to concentrate on hard observations is used to explore a non-standard regression context.
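The abstract does not spell out the boosting procedure itself. As one hedged illustration only, the following is a minimal pure-Python sketch of an AdaBoost.R2-style regression boosting loop over weighted decision stumps; this is a standard variant of AdaBoost for regression, not necessarily the exact algorithm studied in the paper, and the names `fit_stump` and `adaboost_r2` are hypothetical.

```python
import math

def fit_stump(X, y, w):
    # Weighted decision stump for 1-D regression: pick the threshold that
    # minimises weighted absolute error; predict the weighted mean on each side.
    best = None
    for t in sorted(set(X)):
        left = [i for i in range(len(X)) if X[i] <= t]
        right = [i for i in range(len(X)) if X[i] > t]
        if not left or not right:
            continue
        def wmean(idx):
            s = sum(w[i] for i in idx)
            return sum(w[i] * y[i] for i in idx) / s
        ml, mr = wmean(left), wmean(right)
        err = sum(w[i] * abs(y[i] - (ml if X[i] <= t else mr))
                  for i in range(len(X)))
        if best is None or err < best[0]:
            best = (err, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

def adaboost_r2(X, y, n_rounds=10):
    # AdaBoost.R2-style loop: reweight training points so that later rounds
    # concentrate on the hardest (largest-loss) observations.
    n = len(X)
    w = [1.0 / n] * n
    models, betas = [], []
    for _ in range(n_rounds):
        h = fit_stump(X, y, w)
        losses = [abs(h(X[i]) - y[i]) for i in range(n)]
        D = max(losses) or 1.0
        L = [l / D for l in losses]                 # linear loss in [0, 1]
        Lbar = sum(w[i] * L[i] for i in range(n))   # weighted average loss
        if Lbar <= 0:                               # perfect fit: keep and stop
            models.append(h)
            betas.append(1e-10)
            break
        if Lbar >= 0.5:                             # too weak: stop boosting
            break
        beta = Lbar / (1.0 - Lbar)
        w = [w[i] * beta ** (1.0 - L[i]) for i in range(n)]
        Z = sum(w)
        w = [wi / Z for wi in w]                    # renormalise the weights
        models.append(h)
        betas.append(beta)

    def predict(x):
        # Weighted-median combination of the boosted predictors.
        pairs = sorted((h(x), math.log(1.0 / b)) for h, b in zip(models, betas))
        total = sum(p[1] for p in pairs)
        acc = 0.0
        for pred, wt in pairs:
            acc += wt
            if acc >= total / 2:
                return pred
        return pairs[-1][0]
    return predict
```

The reweighting step is what makes the boosted sequence concentrate on hard observations and outliers, which is precisely the behaviour the abstract exploits to define the instability index and to explore the non-standard regression context.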