Markov chain analysis of genetic algorithms in a wide variety of noisy environments
Proceedings of the 11th Annual Conference on Genetic and Evolutionary Computation
In this study, we take a first step toward theoretically analyzing genetic algorithms (GAs) in noisy environments using Markov chain theory. We explicitly construct a Markov chain that models GAs applied to fitness functions perturbed by either additive or multiplicative noise taking finitely many values, and we analyze the chain to investigate the transition and convergence properties of the GAs. For the additive case, our analysis shows that GAs eventually (i.e., as the number of iterations goes to infinity) find at least one globally optimal solution with probability 1. In contrast, in the multiplicative case GAs may, with probability 1, eventually fail to do so, and we establish a condition that is both necessary and sufficient for eventually finding a globally optimal solution. In addition, our analysis shows that the chain has a stationary distribution that is also its steady-state distribution. Based on this property, we derive an upper bound on the number of iterations sufficient to ensure, with a prescribed probability, that a GA has reached the set of globally optimal solutions and continues to include in each subsequent population at least one globally optimal solution whose observed fitness value is greater than that of any suboptimal solution.
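To make the setting concrete, the following minimal sketch (not the paper's Markov chain construction) runs a simple GA on the OneMax problem where every fitness evaluation is perturbed by additive noise drawn from a finite set, matching the paper's additive noise model. All names, parameters, and the choice of OneMax are illustrative assumptions.

```python
import random

def simple_ga(n_bits=8, pop_size=20, generations=200, p_mut=0.05, seed=1,
              noise_values=(-1, 0, 1)):
    """Simple GA on OneMax with additive noise taking finitely many values.

    The GA only ever sees noisy observed fitness; the true fitness
    (number of ones) is used here solely to inspect the result.
    """
    rng = random.Random(seed)

    def observed_fitness(x):
        # True fitness plus additive noise from a finite set (additive case).
        return sum(x) + rng.choice(noise_values)

    # Random initial population of bitstrings.
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    for _ in range(generations):
        def select():
            # Binary tournament on *observed* (noisy) fitness.
            a, b = rng.sample(pop, 2)
            return a if observed_fitness(a) >= observed_fitness(b) else b

        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)           # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if rng.random() < p_mut else g for g in child]
            new_pop.append(child)
        pop = new_pop
    return pop
```

Because the noise is additive and bounded while selection pressure acts on the underlying fitness, repeated runs drive the population toward the global optimum (the all-ones string), illustrating the convergence behavior the additive-case analysis guarantees in the limit.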