Evolution and Optimum Seeking: The Sixth Generation
Evolution strategies – A comprehensive introduction
Natural Computing: an international journal
On classes of functions for which No Free Lunch results hold
Information Processing Letters
Completely Derandomized Self-Adaptation in Evolution Strategies
Evolutionary Computation
Differential Evolution: In Search of Solutions (Springer Optimization and Its Applications)
Proceedings of the 9th annual conference on Genetic and evolutionary computation
Comparison-based algorithms are robust and randomized algorithms are anytime
Evolutionary Computation
Adaptive Encoding: How to Render Search Coordinate System Invariant
Proceedings of the 10th international conference on Parallel Problem Solving from Nature: PPSN X
A Simple Modification in CMA-ES Achieving Linear Time and Space Complexity
Proceedings of the 10th international conference on Parallel Problem Solving from Nature: PPSN X
No free lunch theorems for optimization
IEEE Transactions on Evolutionary Computation
The particle swarm - explosion, stability, and convergence in a multidimensional complex space
IEEE Transactions on Evolutionary Computation
A (1+1)-CMA-ES for constrained optimisation
Proceedings of the 14th annual conference on Genetic and evolutionary computation
Information Sciences: an International Journal
A comparative analysis of FSS with CMA-ES and S-PSO in ill-conditioned problems
IDEAL'12 Proceedings of the 13th international conference on Intelligent Data Engineering and Automated Learning
Function optimization using cartesian genetic programming
Proceedings of the 15th annual conference companion on Genetic and evolutionary computation
On the equivalences and differences of evolutionary algorithms
Engineering Applications of Artificial Intelligence
Abstract: This paper investigates the behavior of PSO (particle swarm optimization) and CMA-ES (covariance matrix adaptation evolution strategy) on ill-conditioned functions. The paper also highlights momentum as an important concept common to both algorithms and reviews important invariance properties. On separable, ill-conditioned functions, PSO performs very well and outperforms CMA-ES by a factor of up to five. On the same functions rotated, the performance of CMA-ES is unchanged, while the performance of PSO declines dramatically: on non-separable, ill-conditioned functions we find the search costs (number of function evaluations) of PSO increasing roughly in proportion to the condition number, and CMA-ES outperforms PSO by orders of magnitude. The strong dependency of PSO on rotations originates from random events that are only independent within the given coordinate system. CMA-ES adapts the coordinate system in which the independent events take place and is rotationally invariant. We argue that invariance properties, like rotational invariance, are desirable because they increase the predictive power of performance results by inducing problem equivalence classes.
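The separable, ill-conditioned test setting described in the abstract can be sketched as follows. This is an illustrative example, not taken from the paper: the ellipsoid function and the condition number 10^6 are common benchmark choices assumed here, and the rotation matrix is generated via QR decomposition of a Gaussian matrix.

```python
import numpy as np

def ellipsoid(x, cond=1e6):
    # Separable, ill-conditioned benchmark: coordinate i is weighted by
    # cond**(i/(n-1)), so the condition number of the Hessian is `cond`.
    x = np.asarray(x, dtype=float)
    n = len(x)
    weights = cond ** (np.arange(n) / (n - 1))
    return float(weights @ x**2)

rng = np.random.default_rng(0)
n = 10

# Random rotation (orthogonal) matrix via QR decomposition.
R, _ = np.linalg.qr(rng.standard_normal((n, n)))

def rotated_ellipsoid(x):
    # Same function expressed in a rotated coordinate system: it is no
    # longer separable, which is what degrades PSO but not CMA-ES.
    return ellipsoid(R @ np.asarray(x, dtype=float))
```

A rotation-invariant algorithm such as CMA-ES shows (statistically) identical performance on `ellipsoid` and `rotated_ellipsoid`, whereas an algorithm whose sampling is independent per coordinate, like standard PSO, does not.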