There are two main paradigms for combining different classification algorithms: Classifier Selection and Classifier Fusion. The former selects a single model to classify a new instance, while the latter combines the decisions of all models. The work presented in this paper stands between these two paradigms, aiming to avoid the disadvantages and retain the advantages of both. In particular, it proposes the use of statistical procedures to select the best subgroup among different classification algorithms, followed by the fusion of the decisions of the models in this subgroup with simple methods such as Weighted Voting. Extensive experimental results show that the proposed approach, Selective Fusion, improves on simple selection and fusion methods and achieves performance comparable to that of Stacking, the state-of-the-art heterogeneous classifier combination method, without the additional computational cost and learning problems of meta-training.
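The select-then-fuse idea can be sketched as follows. This is a minimal illustration, not the paper's implementation: it replaces the statistical selection procedures with a simple accuracy-margin rule, and the function name, the `margin` parameter, and the example models are all hypothetical.

```python
from collections import defaultdict

def selective_fusion(predictions, accuracies, margin=0.05):
    """Select models whose validation accuracy lies within `margin` of the
    best model, then combine their votes weighted by accuracy.

    predictions: dict mapping model name -> predicted class label
    accuracies:  dict mapping model name -> validation accuracy in [0, 1]
    """
    best = max(accuracies.values())
    # Selection step (simplified: the paper uses statistical procedures
    # rather than a fixed accuracy margin).
    selected = [m for m, a in accuracies.items() if best - a <= margin]
    # Fusion step: Weighted Voting among the selected models, with each
    # model's vote weighted by its validation accuracy.
    votes = defaultdict(float)
    for m in selected:
        votes[predictions[m]] += accuracies[m]
    return max(votes, key=votes.get)

# Hypothetical example: three classifiers vote on one test instance.
preds = {"nb": "A", "svm": "B", "knn": "B"}
accs  = {"nb": 0.81, "svm": 0.86, "knn": 0.84}
print(selective_fusion(preds, accs))  # "B": svm and knn outweigh nb
```

Note that the selection step prunes weak models before voting, so a single poor classifier cannot drag down the ensemble, while the weighted vote still exploits the diversity of the models that survive selection.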