Combining gradient techniques for numerical multi-objective evolutionary optimization
Proceedings of the 8th annual conference on Genetic and evolutionary computation
The problem of computing a good approximation set of the Pareto front of a multiobjective optimization problem can be recast as the maximization of its S-metric value, which measures the dominated hypervolume. Accordingly, the S-metric has recently been applied in a variety of metaheuristics. In this work, a novel high-precision method for computing approximation sets of a Pareto front with maximal S-metric is proposed as a high-level relay hybrid of an evolutionary algorithm and a gradient method, both guided by the S-metric. First, an evolutionary multiobjective optimizer moves the initial population close to the Pareto front. The gradient-based method then takes this population as its starting point for computing a locally maximal approximation set with respect to the S-metric; to this end, the population is moved along the gradient of the S-metric. This paper introduces expressions for computing the gradient of a set of points with respect to its S-metric on the basis of the gradients of the objective functions. It discusses singularities where the gradient vanishes or where differentiability is only one-sided. To circumvent the problem of vanishing gradient components of the S-metric for dominated points in the population, a penalty approach is introduced. In order to test the new hybrid algorithm, we compute the precise maximizer of the S-metric for a generalized Schaffer problem and show, empirically, that the relay hybrid strategy converges linearly to the precise optimum. In addition, we provide first case studies of the hybrid method on complicated benchmark problems.
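To make the abstract's central quantities concrete, here is a minimal 2-D sketch of the S-metric (dominated hypervolume) of a point set and of its gradient with respect to the objective-space coordinates. This is an illustrative reconstruction under stated assumptions, not the paper's implementation: it assumes a bi-objective minimization problem, a reference point `ref`, and a mutually nondominated input set.

```python
# Hypothetical 2-D illustration of the S-metric and its objective-space
# gradient (minimization, reference point ref); not the paper's code.

def s_metric_2d(points, ref):
    """Hypervolume dominated by `points` relative to `ref`."""
    r1, r2 = ref
    pts = sorted(points)  # ascending in f1 implies descending in f2
    hv = 0.0
    for i, (f1, f2) in enumerate(pts):
        # Each point contributes a box reaching right to its successor
        # (or to r1 for the last point) and up to r2.
        next_f1 = pts[i + 1][0] if i + 1 < len(pts) else r1
        hv += (next_f1 - f1) * (r2 - f2)
    return hv

def s_metric_grad_2d(points, ref):
    """Per-point gradient (dHV/df1_i, dHV/df2_i) of the 2-D S-metric.

    For a nondominated set sorted by f1, the partials have the closed form
        dHV/df1_i = f2_i - f2_{i-1}   (with f2_0 := r2),
        dHV/df2_i = f1_i - f1_{i+1}   (with f1_{n+1} := r1),
    so a component vanishes only in the singular cases the paper
    discusses (coinciding coordinates, dominated points).
    """
    r1, r2 = ref
    order = sorted(range(len(points)), key=lambda i: points[i])
    grad = [(0.0, 0.0)] * len(points)
    for k, i in enumerate(order):
        f1, f2 = points[i]
        prev_f2 = points[order[k - 1]][1] if k > 0 else r2
        next_f1 = points[order[k + 1]][0] if k + 1 < len(order) else r1
        grad[i] = (f2 - prev_f2, f1 - next_f1)
    return grad

# Example: three nondominated points with reference point (4, 4).
pts = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)]
hv = s_metric_2d(pts, (4.0, 4.0))      # 1*1 + 1*2 + 1*3 = 6
g = s_metric_grad_2d(pts, (4.0, 4.0))  # (-1, -1) for every point here
```

In the decision space, the chain rule then gives the search direction the abstract alludes to: the S-metric gradient with respect to a point's decision variables is the transposed Jacobian of the objectives applied to the objective-space partials above. Dominated points, whose partials are all zero, are the case the paper's penalty approach addresses.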