Ensemble methods have been successfully used as a classification scheme. The need to reduce the complexity of this popular learning paradigm motivated the development of ensemble pruning algorithms. This paper presents a new ensemble pruning method that substantially reduces the complexity of ensemble methods while outperforming complete bagging in terms of classification accuracy. More importantly, it is a very fast algorithm: it orders all base classifiers with respect to a new criterion that exploits an unsupervised ensemble margin. This method highlights the major influence of low-margin instances on the performance of the pruning task and, more generally, the potential of low-margin instances for the design of better ensembles. An extensive empirical analysis compares the method to both the naive approach of randomly pruning base classifiers and another ordering-based pruning algorithm.
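To make the idea concrete, below is a minimal Python/NumPy sketch of ordering-based pruning driven by an unsupervised ensemble margin, i.e., a margin computed from the vote counts alone, with no true labels. The margin formula (v1 - v2)/T over the two most-voted classes is the standard unsupervised variant; everything else (scoring each classifier by its accuracy on a low-margin subset of a validation set, the quantile threshold, the kept fraction, and all function names) is an illustrative assumption, not the authors' exact criterion.

```python
import numpy as np

def unsupervised_margin(votes):
    """Unsupervised ensemble margin of one instance: (v1 - v2) / T, where
    v1 and v2 are the vote counts of the two most-voted classes and T is
    the ensemble size. No true class label is needed."""
    counts = np.sort(np.bincount(votes, minlength=2))[::-1]
    return (counts[0] - counts[1]) / votes.size

def margin_ordered_pruning(predictions, y_val, keep=0.2, hard_quantile=0.3):
    """Illustrative ordering criterion (assumed, not the paper's exact one):
    rank base classifiers by their accuracy on the lowest-margin validation
    instances, then keep the top `keep` fraction of the ranking.

    predictions : (T, N) int array of base classifier predictions on a
                  validation set (T classifiers, N instances).
    y_val       : (N,) int array of validation labels.
    """
    T, N = predictions.shape
    # Margin of each instance, computed from the ensemble's votes only.
    margins = np.array([unsupervised_margin(predictions[:, i]) for i in range(N)])
    # Concentrate the criterion on the hardest (lowest-margin) instances.
    hard = margins <= np.quantile(margins, hard_quantile)
    # Score each classifier by its accuracy on that low-margin subset.
    scores = (predictions[:, hard] == y_val[hard]).mean(axis=1)
    order = np.argsort(-scores)  # best-ranked classifiers first
    return order[: max(1, int(keep * T))]
```

The pruned ensemble then predicts by majority vote over only the selected members; ranking the classifiers once makes the procedure fast, since no combinatorial search over subsets is needed.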