Evolutionary gradient search (EGS) is an approach to optimization that combines features of gradient strategies with ideas from evolutionary computation. Recently, several modifications to the algorithm have been proposed with the goal of improving its robustness in the presence of noise and its suitability for implementation on parallel computers. In this paper, the value of the proposed modifications is studied analytically. A scaling law is derived that describes the performance of the algorithm on the noisy sphere model and allows comparison with competing strategies. The comparisons yield insights into the interplay of mutation, multirecombination, and selection. The covariance matrix adaptation mechanism originally formulated for evolution strategies is then adapted for use with EGS in order to make the algorithm competitive on objective functions whose Hessians have large condition numbers. The resulting strategy is evaluated experimentally on a number of convex quadratic test functions.
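To make the core idea concrete, the following is a minimal sketch of an EGS step: the gradient is estimated from random trial mutations and a normalized step is taken in the estimated descent direction. The function name, parameter names, and default values are illustrative assumptions, not taken from the paper, and the mutation-strength adaptation that a practical EGS would use is omitted for brevity.

```python
import numpy as np

def egs_minimize(f, x, sigma=0.1, lam=10, iters=200, seed=0):
    """Illustrative sketch of evolutionary gradient search (EGS).

    At each step, lam random mutation vectors probe the objective and
    their fitness differences yield a stochastic gradient estimate;
    a normalized step of fixed length sigma is then taken. A real EGS
    would also adapt sigma, which is omitted here for brevity.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    for _ in range(iters):
        # lam mutation vectors z_i ~ N(0, I)
        z = rng.standard_normal((lam, n))
        # central fitness differences along each mutation direction
        diffs = np.array([f(x + sigma * zi) - f(x - sigma * zi) for zi in z])
        # gradient estimate (up to a positive scale factor)
        g = diffs @ z
        norm = np.linalg.norm(g)
        if norm > 0:
            # normalized descent step of length sigma
            x = x - sigma * g / norm
    return x

# usage: minimize the sphere function f(x) = ||x||^2 from a point away from the optimum
x_opt = egs_minimize(lambda v: np.sum(v**2), np.ones(5))
```

Because the step length is fixed at sigma, this sketch approaches the optimum and then oscillates in its vicinity; the mutation-strength adaptation analyzed in the paper is what allows convergence to continue at finer scales.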