There have been many attempts to adapt the bias-variance framework from regression to classification problems. However, it has recently been shown that only non-straightforward extensions to classification exist. In this paper, we present an alternative visualization framework for classification problems called zone analysis. Zone analysis partly extends the bias-variance idea: instead of decomposing the error into two parts, i.e. the biased and unbiased components, our framework decomposes the error into K components. While bias-variance information is still contained in our framework, it also provides interesting observations that are not readily apparent in the previous bias-variance framework, e.g. a prejudice behavior of the bagging algorithm toward various unbiased instances. Our framework is suitable for visualizing the effect of context changes on learning performance. The type of context change we primarily investigate in this paper is "a change from a base learner to an ensemble learner such as bagging, AdaBoost, arc-x4, and multi-boosting".
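For background, the conventional bias-variance decomposition for 0-1 loss that the abstract contrasts against can be sketched as follows. This is an illustrative implementation of the standard (Domingos-style) per-instance decomposition, not the paper's zone analysis; the function name and the toy prediction lists are hypothetical.

```python
from collections import Counter

def bias_variance_01(true_label, predictions):
    """Per-instance bias/variance for 0-1 loss (Domingos-style sketch).

    predictions: labels produced for one test instance by models trained
    on different randomized training sets (e.g. bootstrap replicates).
    """
    # Main prediction: the most frequent label across the trained models.
    main = Counter(predictions).most_common(1)[0][0]
    # Bias is 1 when the ensemble of models is systematically wrong.
    bias = int(main != true_label)
    # Variance is the fraction of models disagreeing with the main prediction.
    variance = sum(p != main for p in predictions) / len(predictions)
    return main, bias, variance

# Example: three of four bootstrap models predict class 1 for a true-1 instance.
print(bias_variance_01(1, [1, 1, 0, 1]))  # -> (1, 0, 0.25): unbiased, some variance
```

Zone analysis, as described above, refines this two-way split of the error into K components, so that instances with the same bias label can still fall into different zones.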