This paper presents RotBoost, a novel ensemble classifier generation technique that combines Rotation Forest and AdaBoost. Experiments conducted on 36 real-world data sets from the UCI repository, with a classification tree as the base learning algorithm, demonstrate that RotBoost produces ensemble classifiers with significantly lower prediction error than either Rotation Forest or AdaBoost more often than the reverse. RotBoost is also found to perform much better than Bagging and MultiBoost. A bias and variance decomposition of the error gives further insight into the classification methods considered: RotBoost simultaneously reduces both the bias and the variance of a single tree, and by a larger margin than the other ensemble methods, which explains why it performs best among the procedures compared. Furthermore, RotBoost has a potential advantage over AdaBoost in being well suited to parallel execution.
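The construction described above can be sketched roughly as follows. This is a hypothetical illustration, not the authors' code: each ensemble member builds a Rotation-Forest-style rotation (PCA fitted on bootstrap samples of random disjoint feature subsets), then runs AdaBoost on the rotated data, and the members vote. The subset count, sample fraction, and ensemble sizes `S` and `T` here are illustrative choices, and scikit-learn's default tree stumps stand in for the paper's classification trees.

```python
# Hedged sketch of the RotBoost idea (assumed details noted in comments).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier


def rotation_matrix(X, n_subsets, rng):
    """Block-diagonal rotation from per-subset PCA loadings."""
    n_features = X.shape[1]
    subsets = np.array_split(rng.permutation(n_features), n_subsets)
    R = np.zeros((n_features, n_features))
    for idx in subsets:
        # PCA on a 75% bootstrap sample of the rows; all components kept,
        # so the transform rotates rather than reduces the subspace.
        rows = rng.choice(len(X), size=int(0.75 * len(X)), replace=True)
        pca = PCA().fit(X[np.ix_(rows, idx)])
        R[np.ix_(idx, idx)] = pca.components_.T
    return R


class RotBoost:
    def __init__(self, S=10, T=10, n_subsets=2, seed=0):
        self.S, self.T, self.n_subsets = S, T, n_subsets
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        self.members_ = []
        for _ in range(self.S):
            R = rotation_matrix(X, self.n_subsets, self.rng)
            # AdaBoost with T rounds of its default decision stumps
            ada = AdaBoostClassifier(
                n_estimators=self.T,
                random_state=int(self.rng.integers(1 << 31)))
            ada.fit(X @ R, y)
            self.members_.append((R, ada))
        return self

    def predict(self, X):
        # Majority vote over the S rotated AdaBoost members
        # (assumes integer class labels, as in the toy data below).
        votes = np.stack([ada.predict(X @ R) for R, ada in self.members_])
        return np.array([np.bincount(col).argmax() for col in votes.T])


# Toy usage on iris; the accuracy printed is not a result from the paper.
from sklearn.datasets import load_iris
X, y = load_iris(return_X_y=True)
acc = (RotBoost(S=5, T=10).fit(X, y).predict(X) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Because the `S` rotated AdaBoost members are trained independently of one another, the outer loop in `fit` could be distributed across workers, which is the parallel-execution advantage over plain AdaBoost mentioned above.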