Bias-variance decomposition is a useful tool for understanding classifier behavior. Unfortunately, the literature does not provide consistent guidelines on how to apply such a decomposition empirically. This paper examines the parameters and variants of empirical bias-variance decompositions through an extensive simulation study. Based on this study, we recommend using ten-fold cross-validation as the sampling method, drawing 100 samples within each fold, and using a test set of at least 2000 instances. Only if the learning algorithm is stable may fewer samples, a smaller test set, or fewer folds be justified.
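The sampling procedure described above can be sketched in code. The following is a minimal illustration, not the paper's exact protocol: it estimates 0/1-loss bias and variance in the Domingos style (bias = the majority-vote "main" prediction disagrees with the true label; variance = an individual prediction disagrees with the main prediction) by retraining a learner on many resampled training sets. The toy `nearest_mean_classifier` learner and all parameter defaults are hypothetical placeholders for a real learner and the recommended settings.

```python
# Sketch of an empirical 0/1-loss bias-variance estimate (assumption:
# Domingos-style definitions; not the paper's exact procedure).
import random
from collections import Counter

def nearest_mean_classifier(train):
    """Toy learner (hypothetical): predict the class whose training mean
    is closest to the input feature."""
    means = {}
    for label in (0, 1):
        xs = [x for x, y in train if y == label]
        means[label] = sum(xs) / len(xs) if xs else 0.0
    def predict(x):
        return min((0, 1), key=lambda c: abs(x - means[c]))
    return predict

def bias_variance(pool, test, n_samples=100, train_size=50, seed=0):
    """Retrain on n_samples resampled training sets drawn from `pool`,
    then decompose the error on the fixed `test` set."""
    rng = random.Random(seed)
    preds = []  # preds[s][i] = prediction of sample s on test point i
    for _ in range(n_samples):
        train = [rng.choice(pool) for _ in range(train_size)]
        clf = nearest_mean_classifier(train)
        preds.append([clf(x) for x, _ in test])
    bias = variance = 0.0
    for i, (_, y) in enumerate(test):
        votes = Counter(p[i] for p in preds)
        main = votes.most_common(1)[0][0]        # main (majority) prediction
        bias += (main != y)                      # 0/1 bias term
        variance += 1 - votes[main] / n_samples  # disagreement with main
    n = len(test)
    return bias / n, variance / n
```

In a full experiment, the outer loop over resampled training sets would sit inside each fold of a ten-fold cross-validation, with the held-out fold (of at least 2000 instances, per the recommendation above) serving as the fixed test set.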