Decision committee learning has demonstrated spectacular success in reducing the classification error of learned classifiers. These techniques develop a classifier in the form of a committee of subsidiary classifiers, whose outputs are usually combined by majority vote. Voting, however, has a shortcoming: it cannot take local expertise into account. When a new instance is difficult to classify, the average committee member is likely to predict incorrectly, and a majority vote over such members is then even more likely to be wrong. Instead of voting, dynamic integration of classifiers can be used, which is based on the assumption that each committee member is most accurate within certain subareas of the feature space. In this paper, the proposed dynamic integration technique is evaluated with AdaBoost and bagging, two decision committee approaches that have recently received extensive attention. The comparison shows that boosting and bagging often achieve significantly higher accuracy with dynamic integration of classifiers than with simple voting.
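The contrast between majority voting and dynamic integration can be sketched in a few lines. The snippet below is a simplified, hypothetical illustration (not the paper's exact algorithm): each committee member's errors are recorded on a held-out validation set, and for a new instance the member with the lowest error among the k nearest validation instances is selected. All function names and data are illustrative assumptions.

```python
# Hypothetical sketch: dynamic classifier selection versus majority voting.
# Assumes per-member error records on a validation set; the member with the
# lowest error in the new instance's neighbourhood is trusted.

import math
from collections import Counter

def majority_vote(predictions):
    """Return the most common label among the committee's predictions."""
    return Counter(predictions).most_common(1)[0][0]

def dynamic_selection(x, val_points, val_errors, predictions, k=3):
    """Pick the prediction of the member with the smallest local error.

    val_errors[i][j] is 1 if member j erred on validation point i, else 0.
    """
    # Indices of the k validation instances nearest to x (Euclidean distance).
    nearest = sorted(range(len(val_points)),
                     key=lambda i: math.dist(x, val_points[i]))[:k]
    n_members = len(predictions)
    # Each member's average error over that neighbourhood.
    local_err = [sum(val_errors[i][j] for i in nearest) / k
                 for j in range(n_members)]
    best = min(range(n_members), key=lambda j: local_err[j])
    return predictions[best]

# Toy committee: members 0 and 1 err near the origin, member 2 is accurate there.
val_points = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
val_errors = [[1, 1, 0]] * 3 + [[0, 0, 1]] * 3
```

With `predictions = ['a', 'a', 'b']` for an instance near the origin, majority vote returns 'a', while dynamic selection trusts member 2, whose local error in that region is zero, and returns 'b' — illustrating how local expertise can override the majority.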