Speeding up the optimization process on problems with very time-consuming fitness functions is a central task in evolutionary computation. A popular idea is to apply a model as a surrogate for the real fitness function. The performance of this approach depends strongly on how frequently the model is updated with data from new fitness evaluations; in generation-based algorithms, however, this happens only every λ-th fitness evaluation. To overcome this problem we use a steady-state strategy, which updates the model immediately after each fitness evaluation. We present a new model-assisted steady-state Evolution Strategy (ES) that uses Radial-Basis-Function networks as the model. To support self-adaptation in the steady-state algorithm, a median selection scheme is applied. The convergence behavior of the new algorithm is examined with numerical results from extensive simulations on several high-dimensional test functions. It achieves better results than a standard ES, a steady-state ES, or a model-assisted ES.
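The core loop described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a normalized-RBF kernel-regression surrogate in place of a trained RBF network, it omits the median selection scheme and step-size self-adaptation, and the test function (`sphere`) and all parameter values are illustrative choices. The key property it does show is the steady-state update: the surrogate's data set grows immediately after every real fitness evaluation, instead of once per generation.

```python
import math
import random

random.seed(1)

def sphere(x):
    # Stand-in for the expensive "real" fitness function.
    return sum(v * v for v in x)

def rbf_predict(x, archive, h=1.0):
    # Normalized-RBF (Gaussian kernel regression) surrogate built from
    # all past evaluations -- a simplified stand-in for a trained RBF network.
    num = den = 0.0
    for xi, fi in archive:
        r2 = sum((a - b) ** 2 for a, b in zip(x, xi))
        w = math.exp(-r2 / (h * h))
        num += w * fi
        den += w
    return num / den if den > 1e-300 else float("inf")

def steady_state_es(dim=5, mu=10, presample=10, evals=300, sigma=0.3):
    # Initialize mu parents and evaluate them with the real fitness function.
    pop = [[random.uniform(-2, 2) for _ in range(dim)] for _ in range(mu)]
    fit = [sphere(p) for p in pop]
    archive = [(p[:], f) for p, f in zip(pop, fit)]  # surrogate training data
    used = mu
    while used < evals:
        parent = random.choice(pop)
        # Pre-screen several offspring with the cheap surrogate ...
        cands = [[v + sigma * random.gauss(0, 1) for v in parent]
                 for _ in range(presample)]
        best_cand = min(cands, key=lambda c: rbf_predict(c, archive))
        # ... and spend the expensive evaluation only on the most promising one.
        f = sphere(best_cand)
        used += 1
        # Steady-state: the model data is updated after EVERY real evaluation.
        archive.append((best_cand, f))
        # Steady-state replacement: the new point ousts the worst parent.
        worst = max(range(mu), key=lambda i: fit[i])
        if f < fit[worst]:
            pop[worst], fit[worst] = best_cand, f
    return min(fit)

best = steady_state_es()
```

A generation-based variant would instead refit the model only after each batch of λ evaluations; the difference is exactly the placement of the `archive.append` line inside the per-evaluation loop.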