Several pruning strategies that can be used to reduce the size and increase the accuracy of bagging ensembles are analyzed. These heuristics select subsets of complementary classifiers that, when combined, can perform better than the whole ensemble. The pruning methods investigated are based on modifying the order of aggregation of classifiers in the ensemble. In the original bagging algorithm, the order of aggregation is left unspecified. When this order is random, the generalization error typically decreases as the number of classifiers in the ensemble increases. If an appropriate ordering for the aggregation process is devised, the generalization error reaches a minimum at intermediate numbers of classifiers. This minimum lies below the asymptotic error of bagging. Pruned ensembles are obtained by retaining a fraction of the classifiers in the ordered ensemble. The performance of these pruned ensembles is evaluated in several benchmark classification tasks under different training conditions. The results of this empirical investigation show that ordered aggregation can be used for the efficient generation of pruned ensembles that are competitive, in terms of performance and robustness of classification, with computationally more costly methods that directly select optimal or near-optimal subensembles.
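Below is a minimal sketch of the ordered-aggregation idea, assuming scikit-learn's BaggingClassifier. The greedy, accuracy-driven ordering used here is one representative heuristic, not necessarily the specific ordering criteria evaluated in the paper; using the training data as the selection set and keeping the first 20% of the ordered ensemble are likewise illustrative assumptions.

    # Sketch of ordered-aggregation pruning for a bagging ensemble.
    # Assumes scikit-learn; the greedy reduce-error-style ordering below
    # is one simple heuristic, not the paper's exact criteria.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=1500, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # Plain bagging: the order of the classifiers is arbitrary.
    bag = BaggingClassifier(DecisionTreeClassifier(),
                            n_estimators=100, random_state=0).fit(X_tr, y_tr)
    classes = bag.classes_

    # Cache each base classifier's predictions on the selection set
    # (here the training set, as one simple choice) as one-hot votes.
    preds = np.array([est.predict(X_tr) for est in bag.estimators_])      # (T, n)
    onehot = (preds[:, None, :] == classes[None, :, None]).astype(float)  # (T, C, n)

    # Greedy ordering: at each step, append the classifier whose addition
    # maximizes majority-vote accuracy of the current subensemble.
    T = len(bag.estimators_)
    votes = np.zeros_like(onehot[0])        # running per-class vote counts (C, n)
    remaining, order = set(range(T)), []
    for _ in range(T):
        best = max(remaining, key=lambda t: np.mean(
            classes[(votes + onehot[t]).argmax(0)] == y_tr))
        order.append(best)
        remaining.remove(best)
        votes += onehot[best]

    # Prune by retaining an initial fraction of the ordered ensemble.
    kept = [bag.estimators_[t] for t in order[:T // 5]]    # e.g. first 20%

    def vote(estimators, X):
        """Majority vote over a list of fitted classifiers."""
        p = np.array([est.predict(X) for est in estimators])
        counts = (p[:, None, :] == classes[None, :, None]).sum(axis=0)
        return classes[counts.argmax(axis=0)]

    print("full ensemble :", np.mean(bag.predict(X_te) == y_te))
    print("pruned (20%)  :", np.mean(vote(kept, X_te) == y_te))

With an informative ordering of this kind, the test error of the retained subensemble can fall below that of the complete ensemble, which is the qualitative behavior the abstract describes; a random ordering, by contrast, only approaches the full ensemble's error monotonically from above.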