A boosting algorithm based on cellular genetic programming is proposed to build an ensemble of predictors. The method evolves a population of trees for a fixed number of rounds and, after each round, selects the predictors to include in the ensemble by applying a clustering algorithm to the population of classifiers. The method runs on a distributed hybrid multi-island environment that combines the island and cellular models of parallel genetic programming. The large amount of memory required to store the ensemble makes the method heavy to deploy. The paper shows that, by applying suitable pruning strategies, it is possible to select a subset of the classifiers without increasing misclassification errors; indeed, with pruning rates of up to 20%, ensemble accuracy actually increases. Experiments on several data sets show that combining clustering and pruning enhances the classification accuracy of the ensemble approach.
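The clustering-plus-pruning step can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's cellular genetic programming implementation: bootstrap-trained decision trees stand in for the evolved GP trees, k-means groups classifiers by their prediction vectors on held-out data, and one representative per cluster (the most accurate member) is kept as the pruned ensemble. All dataset sizes, the number of clusters, and the choice of scikit-learn components are assumptions for the sake of the example.

```python
# Hypothetical sketch: cluster a population of classifiers by their
# predictions, then prune to one representative per cluster.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.cluster import KMeans

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_train, y_train = X[:300], y[:300]
X_val, y_val = X[300:], y[300:]

# A "population" of diverse weak classifiers (stand-ins for GP trees),
# each trained on a different bootstrap sample for diversity.
population = []
for i in range(20):
    idx = rng.choice(len(X_train), size=len(X_train), replace=True)
    tree = DecisionTreeClassifier(max_depth=3, random_state=i)
    population.append(tree.fit(X_train[idx], y_train[idx]))

# Cluster classifiers by their prediction vectors on validation data.
preds = np.array([clf.predict(X_val) for clf in population])
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(preds)

# Prune: keep the most accurate member of each cluster.
ensemble = []
for c in range(5):
    members = [i for i in range(len(population)) if labels[i] == c]
    best = max(members, key=lambda i: (preds[i] == y_val).mean())
    ensemble.append(population[best])

# Combine the pruned ensemble by majority vote.
votes = np.array([clf.predict(X_val) for clf in ensemble])
majority = (votes.mean(axis=0) >= 0.5).astype(int)
accuracy = (majority == y_val).mean()
```

Because each cluster groups classifiers that make similar predictions, keeping one member per cluster discards redundant predictors while preserving the diversity that the vote relies on, which is why accuracy can hold steady (or improve) despite pruning.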