Class-Separability Weighting and Bootstrapping in Error Correcting Output Code Ensembles
MCS'10 Proceedings of the 9th International Conference on Multiple Classifier Systems
Through experiments on publicly available multi-class datasets, we examine the effect of bootstrapping on the bias/variance behaviour of error-correcting output code (ECOC) ensembles. We present evidence that the general trend is for bootstrapping to reduce variance while slightly increasing bias error. This usually lowers the minimum attainable ensemble error; however, this is not always the case, and bootstrapping appears to be most useful on datasets where the non-bootstrapped ensemble classifier is prone to overfitting.
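To make the setup concrete, the following is a minimal sketch of an ECOC ensemble with optional bootstrapping, not the authors' implementation: each column of a random binary code matrix defines a two-class relabelling of the data, a base learner is trained per column (on a bootstrap resample when `bootstrap=True`), and predictions are decoded by nearest codeword in Hamming distance. The function names (`ecoc_fit`, `ecoc_predict`), the choice of decision trees as base learners, and the parameter values are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def ecoc_fit(X, y, n_bits=15, bootstrap=True, seed=0):
    """Train one binary base learner per column of a random code matrix.

    Illustrative sketch only: random (rather than designed) codes and
    decision-tree base learners are assumptions, not the paper's setup.
    """
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    # Code matrix: one row per class, one {0,1} codeword bit per learner.
    code = rng.integers(0, 2, size=(len(classes), n_bits))
    learners = []
    for j in range(n_bits):
        # Relabel every sample with its class's bit for this column.
        y_bin = code[np.searchsorted(classes, y), j]
        idx = np.arange(len(X))
        if bootstrap:
            # Bootstrapping: resample the training set with replacement,
            # so each base learner sees a different perturbed sample.
            idx = rng.choice(idx, size=len(idx), replace=True)
        clf = DecisionTreeClassifier(random_state=seed)
        learners.append(clf.fit(X[idx], y_bin[idx]))
    return classes, code, learners

def ecoc_predict(X, classes, code, learners):
    """Decode: pick the class whose codeword is nearest in Hamming distance."""
    bits = np.column_stack([clf.predict(X) for clf in learners])
    dists = np.abs(bits[:, None, :] - code[None, :, :]).sum(axis=2)
    return classes[dists.argmin(axis=1)]
```

Measuring bias and variance, as in the abstract, would then mean repeating `ecoc_fit` over many resampled training sets and decomposing the resulting error; toggling `bootstrap` isolates its effect.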