Metamodels can speed up the optimization process. Previously evaluated designs can be used as a training set for building surrogate models, after which an inexpensive virtual optimization can be performed. Candidate solutions found in this way must then be validated, i.e. evaluated by the real solver. This process can be iterated automatically, which is the rationale behind fast optimization algorithms: at each iteration the newly evaluated designs enrich the training database, allowing increasingly accurate metamodels to be built in an adaptive way. In this paper a novel scheme for fast optimizers is introduced: the virtual optimization, which represents an exploitation process, is accompanied by a virtual run of a suitable space-filling algorithm for exploration purposes, increasing the robustness of the fast optimizer.
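The adaptive loop described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's method: the `real_solver` objective, the inverse-distance-weighting surrogate (a simple stand-in for metamodels such as RBF or Kriging), and the farthest-point space-filling rule are all assumptions chosen for brevity.

```python
import math
import random

# Illustrative stand-in for an expensive "real solver" (any black-box objective).
def real_solver(x):
    return (x - 0.3) ** 2 + 0.05 * math.sin(25 * x)

# Inverse-distance-weighting surrogate built from the evaluated designs;
# a simple placeholder for the metamodels discussed in the paper.
def surrogate(x, data):
    num = den = 0.0
    for xi, yi in data:
        d = abs(x - xi)
        if d < 1e-12:
            return yi          # exact match with an evaluated design
        w = 1.0 / d ** 2
        num += w * yi
        den += w
    return num / den

random.seed(0)
# Initial training database of validated (design, response) pairs.
data = [(x, real_solver(x)) for x in (0.0, 0.5, 1.0)]

for it in range(10):
    cand = [random.random() for _ in range(200)]
    # Exploitation: virtual optimization on the inexpensive surrogate.
    x_opt = min(cand, key=lambda x: surrogate(x, data))
    # Exploration: virtual space-filling run, picking the candidate
    # farthest from all previously evaluated designs.
    x_fill = max(cand, key=lambda x: min(abs(x - xi) for xi, _ in data))
    # Validate both candidates with the real solver; the new evaluations
    # enrich the training database for the next, more accurate surrogate.
    data.append((x_opt, real_solver(x_opt)))
    data.append((x_fill, real_solver(x_fill)))

best = min(data, key=lambda p: p[1])
print(best)
```

The combination of the two candidate streams is the point of the scheme: the surrogate minimizer exploits what the metamodel already predicts well, while the space-filling point probes under-sampled regions, making the fast optimizer more robust against a misleading surrogate.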