Response surface methods, and global optimization techniques in general, are typically evaluated on a small number of standard synthetic test problems, in the hope that these are a good surrogate for real-world problems. We introduce a new, more rigorous methodology for evaluating global optimization techniques that is based on generating thousands of test functions and then evaluating algorithm performance on each one. The test functions are generated by sampling from a Gaussian process, which allows us to create a set of test functions that are interesting and diverse: they have different numbers of modes, different maxima, and so on, yet they are similar to each other in overall structure and level of difficulty. This approach allows for a much richer empirical evaluation of methods, capable of revealing insights that would not be gained using a small set of test functions. To facilitate the development of large empirical studies for evaluating response surface methods, we introduce a dimension-independent measure of average test problem difficulty, and we introduce acquisition criteria that are invariant to vertical shifting and scaling of the objective function. We also use our experimental methodology to conduct a large empirical study of response surface methods, investigating the influence of three properties (parameter estimation, exploration level, and gradient information) on their performance.
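The test-function generation step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it draws a function from a Gaussian process prior with a squared-exponential kernel on a 1-D grid, where the length scale (a value chosen here for illustration) controls how many modes a typical draw has, so that different random seeds yield diverse functions of comparable difficulty.

```python
import numpy as np

def sample_gp_test_function(n_points=200, length_scale=0.1, seed=0):
    """Draw one synthetic test function from a GP prior with a
    squared-exponential (RBF) kernel on a grid over [0, 1]."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 1.0, n_points)
    # RBF covariance matrix: K[i, j] = exp(-0.5 * (x_i - x_j)^2 / l^2)
    d = x[:, None] - x[None, :]
    K = np.exp(-0.5 * (d / length_scale) ** 2)
    # Small jitter on the diagonal for a numerically stable Cholesky factor
    L = np.linalg.cholesky(K + 1e-10 * np.eye(n_points))
    # f = L z with z ~ N(0, I) has covariance L L^T = K
    f = L @ rng.standard_normal(n_points)
    return x, f

# Each seed gives a different test function with similar overall structure,
# so a benchmark can iterate over seeds to generate thousands of problems.
x, f = sample_gp_test_function(seed=1)
```

Repeating the draw with many seeds (and, in higher dimensions, a grid or sample of input points) produces the kind of large, structurally homogeneous test suite the methodology calls for.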