We consider methods for noisy multiobjective optimization, specifically for approximating a true underlying Pareto front when function evaluations are corrupted by Gaussian measurement noise on the objective values. We focus on scenarios with a limited budget of function evaluations (100 and 250), where an iterative optimization method based on surrogate modeling of the multiobjective fitness landscape -- ParEGO -- was previously found to be very effective in the noise-free case. Our investigation measures how ParEGO degrades as noise levels increase. We also introduce TOMO, a new method proposed for limited-budget, noisy scenarios; it derives from the single-objective PB1 algorithm and iteratively seeks the basins of optima by applying nonparametric statistical tests over previously visited points. We find that ParEGO tends to outperform TOMO, and that both methods, especially ParEGO, are quite robust to noise; TOMO is competitive with ParEGO, and perhaps edges it, on budgets of 100 evaluations with low noise. Both methods usually beat our suite of five baseline comparisons.
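The kind of nonparametric comparison described above can be illustrated with a minimal sketch: two previously visited points are re-evaluated several times under Gaussian measurement noise, and a Mann-Whitney-style rank-sum statistic decides which point's basin looks better. This is an illustrative stand-in, not the paper's actual TOMO/PB1 procedure; the objective function, noise level, and sample sizes are all assumptions.

```python
import random

def noisy_objective(x, sigma=0.1):
    # Hypothetical 1-D test objective (minimum at x = 0.3) plus
    # Gaussian measurement noise, mimicking a noisy black-box evaluation.
    return (x - 0.3) ** 2 + random.gauss(0.0, sigma)

def rank_sum_u(a, b):
    # Mann-Whitney U statistic for sample `a` versus sample `b`,
    # with ties resolved by midranks. Small U means `a` tends to be
    # smaller (better, for minimization) than `b`.
    combined = sorted((v, i) for i, v in enumerate(a + b))
    ranks = [0.0] * len(combined)
    k = 0
    while k < len(combined):
        j = k
        # Group tied values and assign each the midrank of the group.
        while j + 1 < len(combined) and combined[j + 1][0] == combined[k][0]:
            j += 1
        midrank = (k + j) / 2 + 1
        for t in range(k, j + 1):
            ranks[combined[t][1]] = midrank
        k = j + 1
    r_a = sum(ranks[: len(a)])
    return r_a - len(a) * (len(a) + 1) / 2

random.seed(1)
# Re-evaluate two visited points a few times each, as a budget-limited
# optimizer might, and compare their noisy samples by ranks.
samples_near_optimum = [noisy_objective(0.3) for _ in range(10)]
samples_far = [noisy_objective(0.9) for _ in range(10)]
u = rank_sum_u(samples_near_optimum, samples_far)
# U well below n1 * n2 / 2 = 50 favors the first point's basin.
print(u)
```

With the true objective gap (0.0 versus 0.36) large relative to the noise, the rank statistic separates the two points decisively even from a handful of noisy samples; as noise grows, U drifts toward its null mean and more re-evaluations are needed, which is exactly the tension a fixed evaluation budget creates.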