Nonstationary function optimization using genetic algorithm with dominance and diploidy
Proceedings of the Second International Conference on Genetic Algorithms on Genetic algorithms and their application
Global optimization
Adaptation in natural and artificial systems
An introduction to genetic algorithms
Evaluating evolutionary algorithms
Artificial Intelligence - Special volume on empirical methods
Evolutionary Optimization in Dynamic Environments
Optimal Mutation and Crossover Rates for a Genetic Algorithm Operating in a Dynamic Environment
EP '98 Proceedings of the 7th International Conference on Evolutionary Programming VII
An analysis of the behavior of a class of genetic adaptive systems.
Building Blocks, Cohort Genetic Algorithms, and Hyperplane-Defined Functions
Evolutionary Computation
EC'05 Proceedings of the 3rd European conference on Applications of Evolutionary Computing
GECCO '05 Proceedings of the 7th annual workshop on Genetic and evolutionary computation
Proceedings of the 8th annual conference on Genetic and evolutionary computation
Proceedings of the 9th annual conference on Genetic and evolutionary computation
The impact of the mutation strategy on the quality of solution of parallel genetic algorithms
EC'08 Proceedings of the 9th WSEAS International Conference on Evolutionary Computing
EuroGP'06 Proceedings of the 2006 international conference on Applications of Evolutionary Computing
Dynamic environments have periods of quiescence and periods of change. During quiescence a Genetic Algorithm (GA) should, optimally, exploit good individuals, while during change it should explore new solutions. Self-adaptation is a mechanism that lets each individual in the GA choose its own mutation rate, and thus lets the GA control when it explores new solutions and when it exploits old ones. We examine this mechanism on a recently devised dynamic test suite, the Shaky Ladder Hyperplane-Defined Functions (sl-hdfs). This suite can generate random problems of similar difficulty and provides a platform for systematic, controlled observation of the GA in dynamic environments. We show that, in a variety of circumstances, self-adaptation fails to let the GA outperform fixed-rate mutation on this suite, even when the environment is static. We also show that mutation is beneficial throughout the run of a GA, and that seeding a population with known good genetic material is not always beneficial. We provide explanations for these observations, with particular emphasis on comparing our results to other results [2] that have shown the GA to work in static environments. We conclude with suggestions for how to change the simple GA to address these problems.
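The self-adaptation mechanism described above can be sketched as follows: each individual carries its own mutation rate alongside its genome, the rate is perturbed before being applied, and so the population itself can drift toward exploration (high rates) or exploitation (low rates). This is a minimal illustrative sketch, assuming a bit-string representation and a log-normal rate update; the paper's exact encoding and update rule are not specified here, so these details are assumptions.

```python
import math
import random

def self_adaptive_mutate(individual, tau=0.2, min_rate=1e-4, max_rate=0.5):
    """Mutate an individual's own mutation rate, then its bit-string genome.

    `individual` is a (genome, rate) pair. Because the rate is inherited
    and perturbed each generation, selection can tune it: high rates when
    the environment changes (exploration), low rates when it is quiescent
    (exploitation), with no external schedule.
    """
    genome, rate = individual
    # Perturb the rate multiplicatively (log-normal style), then clamp it.
    rate *= math.exp(tau * random.gauss(0, 1))
    rate = max(min_rate, min(max_rate, rate))
    # Apply bit-flip mutation at the newly self-adapted rate.
    new_genome = [(1 - g) if random.random() < rate else g for g in genome]
    return (new_genome, rate)

# Example: one individual with a 20-bit genome and an initial rate of 0.05.
child_genome, child_rate = self_adaptive_mutate(([0] * 20, 0.05))
```

A fixed-rate GA, by contrast, would skip the rate-perturbation step and flip bits with a constant probability; the paper's finding is that this simpler scheme can match or beat the self-adaptive variant on the sl-hdfs.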