We evaluate the performance of several gradient-free variable-metric continuous optimization schemes on a specific set of quadratic functions. We revisit a randomized Hessian approximation scheme (D. Leventhal and A. S. Lewis, Randomized Hessian estimation and directional search, 2011), discuss its theoretical underpinnings, and introduce a novel, numerically stable implementation of the scheme (RH). For comparison we also consider closely related Covariance Matrix Adaptation (CMA) schemes. A key goal of this study is to elucidate the influence of the eigenvalue distribution of quadratic functions on the convergence properties of the different variable-metric schemes. For this purpose we introduce a class of quadratic functions with parameterizable spectra. Our empirical study shows (i) that the performance of RH methods depends less on the spectral distribution than that of CMA schemes, (ii) that adaptive step-size control is more efficient in the RH method than a line search, and (iii) that the concept of the evolution path enables a substantial speed-up of CMA schemes on quadratic functions but does not alleviate their overall dependence on the eigenvalue spectrum. The present results may trigger research into the design of novel CMA update schemes with improved spectral invariance.
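To make the two building blocks of the study concrete, the following is a minimal sketch (not the authors' actual RH implementation, which is a numerically stable variant) of (a) a quadratic test function with a parameterizable eigenvalue spectrum, obtained by rotating a chosen diagonal spectrum with a random orthogonal matrix, and (b) one Leventhal-Lewis-style randomized Hessian update, which estimates the curvature along a random unit direction by central finite differences and corrects the Hessian approximation along that direction. All function and variable names here are illustrative assumptions.

```python
import numpy as np

def quadratic_with_spectrum(eigenvalues, seed=0):
    """Build f(x) = 0.5 * x^T H x, where H has the given eigenvalue
    spectrum and is rotated by a random orthogonal matrix so that the
    principal axes are not aligned with the coordinate axes."""
    rng = np.random.default_rng(seed)
    n = len(eigenvalues)
    # Random orthogonal matrix from the QR decomposition of a Gaussian matrix.
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    H = Q @ np.diag(eigenvalues) @ Q.T
    return H, lambda x: 0.5 * x @ H @ x

def rh_update(B, f, x, eps=1e-4, rng=None):
    """One randomized Hessian update in the spirit of Leventhal and Lewis:
    sample a uniform random unit direction u, estimate the curvature of f
    along u by a central finite difference, and correct B along u u^T."""
    rng = rng or np.random.default_rng()
    u = rng.standard_normal(len(x))
    u /= np.linalg.norm(u)
    curv = (f(x + eps * u) - 2.0 * f(x) + f(x - eps * u)) / eps**2
    return B + (curv - u @ B @ u) * np.outer(u, u)

# Usage: approximate the Hessian of a 10-D quadratic whose spectrum is
# log-uniform with condition number 1e3 (a hypothetical test instance).
n = 10
spectrum = np.logspace(0, 3, n)
H, f = quadratic_with_spectrum(spectrum)
B = np.eye(n)
rng = np.random.default_rng(1)
x = np.zeros(n)
for _ in range(500):
    B = rh_update(B, f, x, rng=rng)
rel_err = np.linalg.norm(B - H) / np.linalg.norm(H)
print(rel_err)  # relative error shrinks monotonically in Frobenius norm
```

On a quadratic, the finite-difference curvature equals u^T H u up to rounding, so each update removes the approximation error along u exactly; this is why the Frobenius error ||B - H|| is non-increasing step by step.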