Neural Computation
Evolutionary algorithms in theory and practice: evolution strategies, evolutionary programming, genetic algorithms
Factor analysis using delta-rule wake-sleep learning
Neural Computation
Bayesian Methods for Efficient Genetic Programming
Genetic Programming and Evolvable Machines
Schemata, Distributions and Graphical Models in Evolutionary Optimization
Journal of Heuristics
Using Optimal Dependency-Trees for Combinatorial Optimization
ICML '97 Proceedings of the Fourteenth International Conference on Machine Learning
From Recombination of Genes to the Estimation of Distributions I. Binary Parameters
PPSN IV Proceedings of the 4th International Conference on Parallel Problem Solving from Nature
Removing the Genetics from the Standard Genetic Algorithm
FDA - A Scalable Evolutionary Algorithm for the Optimization of Additively Decomposed Functions
Evolutionary Computation
Recently, several evolutionary algorithms have been proposed that build and use an explicit distribution model of the population to perform optimization. A central issue in this class of algorithms is how to estimate the distribution of the selected samples. In this paper, we present a Bayesian evolutionary algorithm (BEA) that learns the sample distribution with a probabilistic graphical model known as the Helmholtz machine. Owing to its generative nature and the availability of the wake-sleep learning algorithm, the Helmholtz machine provides an effective tool for modeling, and sampling from, the distribution of selected individuals. The proposed method has been applied to a suite of GA-deceptive functions. Experimental results show that the BEA with the Helmholtz machine outperforms the simple genetic algorithm.
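The estimate-and-sample loop the abstract describes can be sketched concretely. The code below is a minimal, hypothetical illustration on a GA-deceptive trap function; for simplicity it replaces the paper's Helmholtz-machine model with a factorized Bernoulli model over bit positions (in the spirit of PBIL/UMDA), so all function names and parameter values are illustrative assumptions, not the authors' implementation.

```python
import random

def trap5(bits):
    # 5-bit deceptive trap: the optimum is all ones, but every string
    # with fewer than five ones is rewarded for moving toward all zeros,
    # which deceives hill-climbers and simple recombination.
    u = sum(bits)
    return 5 if u == 5 else 4 - u

def fitness(x):
    # Concatenation of independent 5-bit traps over the bit string.
    return sum(trap5(x[i:i + 5]) for i in range(0, len(x), 5))

def eda(n_bits=20, pop_size=200, n_select=50, generations=60, lr=0.3, seed=0):
    # Generic distribution-based evolutionary loop:
    # sample a population from the model, select the best individuals,
    # re-estimate the model from them, and repeat.
    rng = random.Random(seed)
    p = [0.5] * n_bits          # factorized Bernoulli model (illustrative)
    best = None
    for _ in range(generations):
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n_bits)]
               for _ in range(pop_size)]
        pop.sort(key=fitness, reverse=True)
        selected = pop[:n_select]
        if best is None or fitness(pop[0]) > fitness(best):
            best = pop[0]
        # Model estimation step: blend bitwise frequencies of the
        # selected individuals into the current model.
        for i in range(n_bits):
            freq = sum(x[i] for x in selected) / n_select
            p[i] = (1 - lr) * p[i] + lr * freq
    return best, fitness(best)
```

Note that a univariate model like this one is itself misled by trap functions, since it cannot capture dependencies within a 5-bit block; this limitation is precisely the motivation for richer distribution models such as the Helmholtz machine used in the paper.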