Most surrogate approaches to multi-objective optimization build a separate surrogate model for each objective. These surrogates can be used inside a classical Evolutionary Multiobjective Optimization Algorithm (EMOA) in lieu of the actual objectives, without modifying the underlying EMOA, or to filter out points that the models predict to be uninteresting. In contrast, the proposed approach builds a single global surrogate model, defined on the decision space, that tightly characterizes the current Pareto set and the dominated region, in order to speed up the evolution's progress toward the true Pareto set. This surrogate model combines a One-class Support Vector Machine (SVM), which characterizes the dominated points, with a Regression SVM that clamps the Pareto front onto a single value. The resulting surrogate model is then used within state-of-the-art EMOAs to pre-screen the individuals generated by standard variation operators. Empirical validation on classical MOO benchmark problems shows a significant reduction in the number of evaluations of the actual objective functions.
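The idea of combining a One-class SVM over dominated points with a Regression SVM clamped on the Pareto front can be sketched as follows. This is a minimal illustration using scikit-learn, not the paper's implementation: the toy decision space, the kernel parameters, the score weighting, and the `surrogate_score` helper are all assumptions made for the example.

```python
import numpy as np
from sklearn.svm import OneClassSVM, SVR

rng = np.random.default_rng(42)

# --- Toy archive (assumption: stands in for an EMOA's evaluated points) ---
# Decision space is 2-D; pretend points on the unit circle approximate the
# Pareto set and points farther out are dominated.
theta = rng.uniform(0.0, 2.0 * np.pi, 100)
pareto_set = np.column_stack([np.cos(theta), np.sin(theta)])
dominated = pareto_set * rng.uniform(1.3, 2.5, (100, 1))

# 1) A One-class SVM trained on dominated points characterizes the
#    dominated region of the decision space.
occ = OneClassSVM(kernel="rbf", gamma=1.0, nu=0.1).fit(dominated)

# 2) A Regression SVM "clamps" the Pareto set onto a single target value
#    (here 0.0) while dominated points map to a larger value, yielding a
#    scalar surrogate ranking over the decision space.
X = np.vstack([pareto_set, dominated])
y = np.concatenate([np.zeros(len(pareto_set)), np.ones(len(dominated))])
reg = SVR(kernel="rbf", gamma=1.0, C=10.0).fit(X, y)

def surrogate_score(candidates):
    """Lower is better: close to the clamped Pareto value, and penalized
    when the one-class model places the point inside the dominated region
    (OneClassSVM.predict returns +1 for inliers of its training set)."""
    return reg.predict(candidates) + 0.5 * (occ.predict(candidates) == 1)

# Pre-screening: rank offspring by the surrogate and keep only the most
# promising ones for evaluation with the true objective functions.
offspring = rng.uniform(-2.5, 2.5, size=(50, 2))
scores = surrogate_score(offspring)
survivors = offspring[np.argsort(scores)[:10]]
```

In a full EMOA loop, only `survivors` would be passed to the expensive objective functions; the surrogate would then be refit as the archive of evaluated points grows.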