Generally, evolutionary algorithms require a large number of objective-function evaluations to obtain a good solution. This paper presents a simple approach to saving evaluations, applied to a competitive differential evolution algorithm used to solve constrained optimization problems. The idea is based on the way differential evolution finds new promising areas of the search space: a zero fitness is randomly assigned to some newly generated offspring so that their evaluation is avoided and, as a secondary effect, convergence is slowed down. The approach is tested using different percentages of individuals from the population and shows competitive performance. In addition, the effect that the elimination of individuals has on convergence is analyzed. Finally, to highlight behavioral differences, the approach is compared against a version with a smaller population and against a version with a simple fitness approximation method. The results obtained are discussed and some conclusions are drawn.
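The evaluation-saving idea described above can be sketched in a generic DE/rand/1/bin loop: with some probability, a newly generated trial vector is simply treated as having the worst possible fitness and rejected without ever calling the objective function. The sketch below is a minimal illustration of that mechanism, not the paper's actual competitive, constraint-handling variant; the function and parameter names (`skip_prob`, `sphere`, etc.) are illustrative assumptions.

```python
import random

def sphere(x):
    # Simple unconstrained test objective (minimization): sum of squares.
    return sum(xi * xi for xi in x)

def de_with_skipping(f, dim=5, pop_size=20, F=0.5, CR=0.9,
                     skip_prob=0.3, generations=200, bounds=(-5.0, 5.0)):
    """Generic DE/rand/1/bin where, with probability `skip_prob`, a trial
    vector is discarded WITHOUT evaluating f (the 'zero fitness' idea:
    the unevaluated offspring can never replace its parent), which saves
    evaluations and, as a side effect, slows convergence.
    Illustrative sketch only; parameter names are assumptions."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(ind) for ind in pop]
    evals = pop_size  # count of objective-function evaluations actually spent
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: three distinct individuals, all different from i.
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            # Binomial crossover; jrand guarantees at least one mutated gene.
            jrand = random.randrange(dim)
            trial = [pop[a][j] + F * (pop[b][j] - pop[c][j])
                     if (random.random() < CR or j == jrand) else pop[i][j]
                     for j in range(dim)]
            # Evaluation skipping: with probability skip_prob the offspring
            # gets the worst fitness by fiat and is rejected unevaluated.
            if random.random() < skip_prob:
                continue
            trial_fit = f(trial)
            evals += 1
            if trial_fit <= fit[i]:  # greedy DE selection
                pop[i], fit[i] = trial, trial_fit
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best], evals

best_x, best_f, n_evals = de_with_skipping(sphere)
```

With `skip_prob = 0.3`, roughly 30% of the offspring evaluations are saved each generation; the trade-off the paper analyzes is how much this slows convergence relative to the evaluations saved.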