Bagging constructs an estimator by averaging predictors trained on bootstrap samples of the original data. Bagged estimates almost consistently improve on the original predictor, so it is important to understand the reasons for this success, as well as for the occasional failures. The prevailing explanation attributes bagging's effectiveness to the variance reduction obtained by averaging predictors. However, seven years after its introduction, bagging is still not fully understood. This paper provides experimental evidence supporting the hypothesis that bagging stabilizes prediction by equalizing the influence of training examples. This effect is detailed in two frameworks: estimation on the real line and regression. Bagging's improvements and deteriorations are explained by the goodness or badness of highly influential examples, in situations where the usual variance-reduction argument is at best questionable. Finally, reasons for the equalization effect are advanced, supporting the view that other resampling strategies, such as half-sampling, should produce qualitatively identical effects while being computationally less demanding than bootstrap sampling.
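The ideas above can be illustrated with a minimal sketch (not the paper's actual experiments): a bagged median on the real line, with a `frac` parameter switching between classic bootstrap resampling and the half-sampling variant mentioned at the end, plus a leave-one-out measure of each example's influence on the estimate. All function names and parameters here are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def resampled_median(x, n_rounds=500, frac=1.0):
    """Average the sample median over resampled training sets.

    frac=1.0 draws bootstrap samples (with replacement), i.e. classic
    bagging; frac=0.5 draws half-samples (without replacement), the
    computationally cheaper alternative suggested in the abstract.
    """
    x = np.asarray(x, dtype=float)
    m = max(1, int(frac * len(x)))
    replace = frac >= 1.0
    estimates = [np.median(rng.choice(x, size=m, replace=replace))
                 for _ in range(n_rounds)]
    return float(np.mean(estimates))

def loo_influence(estimator, x):
    """Leave-one-out influence: how much each example shifts the estimate."""
    x = np.asarray(x, dtype=float)
    full = estimator(x)
    return np.array([full - estimator(np.delete(x, i))
                     for i in range(len(x))])

# 30 well-behaved points plus one highly influential outlier.
data = np.concatenate([rng.normal(0.0, 1.0, 30), [8.0]])
plain = float(np.median(data))
bagged = resampled_median(data, frac=1.0)
halved = resampled_median(data, frac=0.5)
print(f"plain median: {plain:.3f}  bagged: {bagged:.3f}  "
      f"half-sampled: {halved:.3f}")

infl = loo_influence(np.median, data)
print(f"largest |influence| on the plain median: {np.abs(infl).max():.3f}")
```

Comparing the per-example influence of the plain estimator with that of its resampled version is one way to probe the equalization hypothesis: if bagging works as claimed, the influence of the outlier should be dampened rather than the variance merely averaged away.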