How to combine the outputs of base classifiers is a key issue in ensemble learning. This paper presents a dynamic classifier ensemble method, termed DCE-CC, which dynamically selects a subset of classifiers for each test sample according to classification confidence. The weights of the base classifiers are learned by optimizing the margin distribution on the training set, and the ordered-aggregation technique is used to estimate an appropriate subset size. We evaluate the proposed fusion method on benchmark classification tasks, using the stable nearest-neighbor rule and the unstable C4.5 decision-tree algorithm, respectively, to generate base classifiers. The experimental results show that our approach is effective compared with several other multiple-classifier fusion algorithms. We then interpret these results from the viewpoint of margin distribution.