An extension of Cellular Genetic Programming for data classification is presented that induces an ensemble of predictors. Each classifier is trained on a different subset of the overall data; the classifiers are then combined to classify new tuples by a simple majority voting algorithm, as in bagging. Preliminary results on a large data set show that an ensemble of classifiers, each trained on a sample of the data, achieves higher accuracy than a single classifier trained on the entire data set, and at a much lower computational cost.
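The bagging-style combination scheme described above can be sketched as follows. This is a minimal illustration, not the authors' Cellular Genetic Programming system: the base learner here is a hypothetical nearest-class-mean classifier standing in for the evolved GP trees, and the function names (`train_ensemble`, `majority_vote`, `classify`) are illustrative choices.

```python
import random
from collections import Counter

def majority_vote(predictions):
    # Combine the individual classifiers' outputs by simple majority voting.
    return Counter(predictions).most_common(1)[0][0]

def train_ensemble(data, labels, train_fn, n_classifiers=7, sample_size=None):
    # Each classifier is trained on a different random subset (a bootstrap
    # sample) of the overall training data, as in bagging.
    sample_size = sample_size or len(data)
    ensemble = []
    for _ in range(n_classifiers):
        idx = [random.randrange(len(data)) for _ in range(sample_size)]
        ensemble.append(train_fn([data[i] for i in idx],
                                 [labels[i] for i in idx]))
    return ensemble

def classify(ensemble, x):
    # A new tuple is classified by taking the majority vote of the ensemble.
    return majority_vote([clf(x) for clf in ensemble])

def train_centroid(xs, ys):
    # Toy base learner on 1-D data (a stand-in for an evolved classifier):
    # predict the class whose training mean is nearest to the input.
    means = {c: sum(x for x, y in zip(xs, ys) if y == c) /
                sum(1 for y in ys if y == c)
             for c in set(ys)}
    return lambda x: min(means, key=lambda c: abs(x - means[c]))

if __name__ == "__main__":
    random.seed(0)
    data = [0.1, 0.2, 0.3, 0.9, 1.0, 1.1]
    labels = ["lo", "lo", "lo", "hi", "hi", "hi"]
    ens = train_ensemble(data, labels, train_centroid)
    print(classify(ens, 0.0), classify(ens, 1.2))
```

Because each base learner sees only a sample of the data, training cost per classifier is bounded by the sample size rather than the full data set, which is the source of the computational savings the abstract reports.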