In this work, we present a generalisation to continuous domains of an optimization method based on evolutionary computation that applies Bayesian classifiers in its learning process. The main difference between this new method, known as Evolutionary Bayesian Classifier-based Optimization Algorithms (EBCOAs), and other estimation of distribution algorithms (EDAs) is how the fitness function is taken into account: it is introduced as a new variable when building the probabilistic graphical model that is then sampled to generate the next population. We also present experimental results comparing the performance of this new method with that of other evolutionary computation paradigms, such as evolution strategies and EDAs. The results show that the new approach performs at least comparably to these other paradigms.
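The core idea can be illustrated with a minimal sketch. The following is NOT the paper's algorithm: it is a simplified EBCOA-style loop under several assumptions (a Gaussian naive Bayes model with independent variables, fitness discretised into two classes by a median split, and sampling only from the model of the "good" class, which degenerates to a UMDA-like continuous EDA). The benchmark function, population sizes, and all helper names are illustrative choices, not from the source.

```python
# Hypothetical EBCOA-style sketch (NOT the paper's exact method):
# fitness enters the model as a class label for a naive Bayes classifier,
# and the next population is sampled from the learned model.
import math
import random

random.seed(0)

def sphere(x):
    # Continuous benchmark fitness to minimise (illustrative choice).
    return sum(v * v for v in x)

def fit_gaussians(individuals, dim):
    # Per-dimension mean/stddev: the naive-Bayes independence assumption
    # means each variable gets its own class-conditional Gaussian.
    stats = []
    for d in range(dim):
        vals = [ind[d] for ind in individuals]
        mu = sum(vals) / len(vals)
        var = sum((v - mu) ** 2 for v in vals) / len(vals)
        stats.append((mu, math.sqrt(var) + 1e-9))  # avoid zero stddev
    return stats

def ebcoa_sketch(dim=5, pop_size=60, generations=40):
    pop = [[random.uniform(-5, 5) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sphere)
        # Fitness as a class variable: median split into "good" vs "bad".
        good = pop[: pop_size // 2]
        stats = fit_gaussians(good, dim)
        # Sample the next population from the "good"-class model,
        # keeping the current best individual (elitism).
        pop = [pop[0]] + [
            [random.gauss(mu, sd) for mu, sd in stats]
            for _ in range(pop_size - 1)
        ]
    return min(pop, key=sphere)

best = ebcoa_sketch()
```

The actual EBCOAs build a full Bayesian classifier over the fitness-derived class and the problem variables; this sketch only keeps the flavour of the selection-model-sample cycle.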