Several studies have reported that an ensemble of classifiers can outperform a stand-alone classifier. In this paper, we propose a learning method for combining the predictions of a set of classifiers. The method uses a genetic-algorithm-based variant of correspondence analysis for combining classifiers. Correspondence analysis is based on an orthonormal representation of the labels assigned to the patterns by a pool of classifiers. Instead of a single orthonormal representation, we use a pool of representations obtained by a genetic algorithm. Each representation is used to train a different classifier, and these classifiers are then combined by the vote rule. The performance improvement over other learning-based fusion methods is validated through experiments on several benchmark datasets.
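The pipeline described above can be illustrated with a highly simplified sketch. Note the caveats: correspondence analysis itself is omitted, the genetic algorithm here evolves plain bit-mask "representations" with an illustrative fitness function, and all names (`majority_vote`, `evolve_masks`) are hypothetical, not the paper's actual implementation. The sketch only shows the two generic building blocks: a GA producing a pool of representations, and vote-rule fusion of per-classifier predictions.

```python
import random
from collections import Counter

def majority_vote(predictions):
    """Fuse predictions by plurality vote.

    predictions: list of per-classifier label lists (integer labels),
    all of equal length. Ties are broken by the smaller label.
    """
    fused = []
    for column in zip(*predictions):
        counts = Counter(column)
        fused.append(max(counts, key=lambda lab: (counts[lab], -lab)))
    return fused

def evolve_masks(fitness, n_bits, pop_size=8, generations=20, seed=0):
    """Toy genetic algorithm returning a pool of bit-mask 'representations'.

    fitness: maps a bit tuple to a score to maximise (illustrative
    stand-in for, e.g., validation accuracy of a classifier trained
    on that representation).
    """
    rng = random.Random(seed)
    pop = [tuple(rng.randint(0, 1) for _ in range(n_bits))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)      # one-point crossover
            child = list(a[:cut] + b[cut:])
            i = rng.randrange(n_bits)           # single-point mutation
            child[i] ^= 1
            children.append(tuple(child))
        pop = parents + children                # parents kept (elitism)
    return pop
```

In an ensemble built along these lines, each evolved mask would define the input representation for one base classifier, and `majority_vote` would fuse the resulting label predictions at test time.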