Evolutionary learning proceeds by evolving a population of classifiers, from which it generally returns (with some notable exceptions) the single best-of-run classifier as the final result. Meanwhile, ensemble learning, one of the most effective approaches in supervised machine learning over the last decade, proceeds by building a population of diverse classifiers. Combining ensemble learning with evolutionary computation has therefore received increasing attention. The Evolutionary Ensemble Learning (EEL) approach presented in this paper features two contributions. First, a new fitness function, inspired by co-evolution and enforcing classifier diversity, is presented. Second, a new selection criterion based on the classification margin is proposed. This criterion is used to extract the classifier ensemble either from the final population only (Off-EEL) or incrementally along evolution (On-EEL). Experiments on a set of benchmark problems show that Off-EEL outperforms single-hypothesis evolutionary learning and state-of-the-art boosting while generating smaller classifier ensembles.
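The margin-based extraction step can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes binary classifiers represented as functions returning labels in {-1, +1}, equally weighted majority voting, and a greedy loop that, at each step, adds the population member that most improves the worst-case (minimum) margin on the training set; all function names are hypothetical.

```python
import numpy as np

def margins(ensemble, X, y):
    # Margin of each example under equally weighted majority vote:
    # y * (average vote), with labels and votes in {-1, +1}.
    votes = np.mean([clf(X) for clf in ensemble], axis=0)
    return y * votes

def off_eel_select(population, X, y, max_size=10):
    """Greedy margin-based ensemble extraction (a sketch of the Off-EEL
    idea): repeatedly add the classifier from the final population that
    maximizes the minimum margin of the resulting ensemble."""
    ensemble = []
    for _ in range(max_size):
        best, best_score = None, -np.inf
        for clf in population:
            if any(clf is member for member in ensemble):
                continue  # each classifier enters the ensemble at most once
            score = margins(ensemble + [clf], X, y).min()
            if score > best_score:
                best, best_score = clf, score
        if best is None:  # population exhausted
            break
        ensemble.append(best)
    return ensemble
```

An On-EEL variant would apply the same criterion incrementally during evolution rather than once on the final population; the greedy loop itself is unchanged.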