This paper presents a new algorithm for boosting the performance of an ensemble of classifiers. In Boosting, a series of classifiers predicts the class of data, with later members of the series concentrating on training data that earlier members predicted incorrectly. To classify a new pattern, each classifier predicts the pattern's class and these predictions are then combined. In standard Boosting, each prediction is weighted by a term related to that classifier's accuracy on the whole training set. This approach ignores the fact that later classifiers focus on small subsets of the patterns and thus may only be good at classifying similar patterns. RegionBoost addresses this problem by weighting each classifier's prediction by a factor measuring how well that classifier performs on patterns similar to the one being classified. In this paper we examine several methods for determining how well a classifier performs on similar patterns. Empirical tests indicate that RegionBoost improves performance on some data sets and has little effect on others.
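The combination rule described above can be illustrated with a minimal sketch. This is not the paper's implementation: the helper name, the choice of Euclidean distance, and the use of a k-nearest-neighbour local accuracy estimate are all illustrative assumptions. Each classifier's vote on a query pattern is weighted by its accuracy on the k training patterns nearest to that query, rather than by a single global weight.

```python
import math
from collections import Counter

def region_boost_predict(classifiers, train_X, train_y, x, k=3):
    """Hypothetical sketch of a RegionBoost-style combination rule:
    weight each classifier's vote on x by its accuracy on the k
    training points nearest to x (a local, not global, weight)."""
    # Euclidean distance from the query x to every training point
    dists = [math.dist(x, tx) for tx in train_X]
    # indices of the k nearest training points
    nearest = sorted(range(len(train_X)), key=lambda i: dists[i])[:k]
    votes = Counter()
    for clf in classifiers:
        # local weight: fraction of the k neighbours this classifier gets right
        local_acc = sum(clf(train_X[i]) == train_y[i] for i in nearest) / k
        votes[clf(x)] += local_acc
    # return the class with the largest locally weighted vote
    return votes.most_common(1)[0][0]
```

With this rule, a classifier that is accurate only in one region of the input space contributes strongly there and is effectively silenced elsewhere, which is the intended contrast with standard Boosting's single accuracy-based weight per classifier.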