Effect of SMS-EMOA parameterizations on hypervolume decreases
LION'12 Proceedings of the 6th international conference on Learning and Intelligent Optimization
Typically, the variation operators deployed in evolutionary multiobjective optimization algorithms (EMOA) are either simulated binary crossover with polynomial mutation or differential evolution operators. This empirical study aims to develop a sound method for assessing which of these variation operators performs best in the multiobjective context. For the S-metric selection EMOA, our main findings are: (1) the performance of the tuned operators improved significantly compared to the default parameterizations; (2) the performance of the two tuned variation operators is very similar; (3) the optimized parameter configurations differ considerably across the considered problems.
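For illustration, a minimal sketch of the variation operators named above, applied to a single real-valued gene: simulated binary crossover (SBX), polynomial mutation, and the differential mutation step of differential evolution. The distribution indices (`eta`) and scale factor (`f`) below are common textbook defaults, not the tuned settings from this study.

```python
import random

def sbx_crossover(p1, p2, eta=15.0):
    """Simulated binary crossover (SBX) on one real-valued gene.
    A larger distribution index eta keeps children closer to the parents.
    By construction c1 + c2 == p1 + p2 (mean-preserving)."""
    u = random.random()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (eta + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
    c1 = 0.5 * ((1.0 + beta) * p1 + (1.0 - beta) * p2)
    c2 = 0.5 * ((1.0 - beta) * p1 + (1.0 + beta) * p2)
    return c1, c2

def polynomial_mutation(x, lower, upper, eta=20.0):
    """Polynomial mutation on one gene, clipped to [lower, upper]."""
    u = random.random()
    if u < 0.5:
        delta = (2.0 * u) ** (1.0 / (eta + 1.0)) - 1.0
    else:
        delta = 1.0 - (2.0 * (1.0 - u)) ** (1.0 / (eta + 1.0))
    return min(max(x + delta * (upper - lower), lower), upper)

def de_rand_1(a, b, c, f=0.5):
    """DE/rand/1 differential mutation: perturb base vector a by a
    scaled difference of two other population members b and c."""
    return a + f * (b - c)
```

In a full EMOA these per-gene operators are applied component-wise to decision vectors, with per-variable crossover and mutation probabilities as further tunable parameters — precisely the parameters the study's tuning procedure optimizes.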