In many global optimization problems motivated by engineering applications, the number of function evaluations is severely limited by time or cost. To ensure that each evaluation contributes to localizing good candidates for the role of global minimizer, evaluation points are usually chosen sequentially. In particular, when Kriging is used to interpolate past evaluations, the uncertainty associated with the lack of information on the function can be expressed and used to compute a number of criteria quantifying the interest of an additional evaluation at any given point. This paper introduces minimizer entropy as a new Kriging-based criterion for the sequential choice of points at which the function should be evaluated. Based on stepwise uncertainty reduction, it accounts for the informational gain on the minimizer expected from a new evaluation. The criterion is approximated using conditional simulations of the Gaussian process model behind Kriging, and then inserted into an algorithm similar in spirit to the Efficient Global Optimization (EGO) algorithm. An empirical comparison is carried out between our criterion and expected improvement, one of the reference criteria in the literature. Experimental results indicate significant evaluation savings over EGO. Finally, the method, which we call IAGO (for Informational Approach to Global Optimization), is extended to robust optimization problems, in which both the factors to be tuned and the function evaluations are corrupted by noise.
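The core approximation described above — estimating the distribution of the global minimizer from conditional simulations of the Gaussian process posterior, then taking its Shannon entropy — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the squared-exponential kernel, grid, jitter values, and sample sizes are all assumptions chosen for the example.

```python
import numpy as np

def sq_exp_kernel(a, b, length=0.3):
    """Squared-exponential covariance between 1-D point sets (assumed kernel)."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

def minimizer_entropy(x_obs, y_obs, grid, n_sims=2000, seed=0):
    """Shannon entropy of the grid location of the global minimizer,
    estimated from conditional simulations of the GP posterior."""
    rng = np.random.default_rng(seed)
    K = sq_exp_kernel(x_obs, x_obs) + 1e-8 * np.eye(len(x_obs))
    Ks = sq_exp_kernel(grid, x_obs)
    # Standard GP conditioning on the past (noise-free) evaluations.
    mean = Ks @ np.linalg.solve(K, y_obs)
    cov = sq_exp_kernel(grid, grid) - Ks @ np.linalg.solve(K, Ks.T)
    L = np.linalg.cholesky(cov + 1e-6 * np.eye(len(grid)))
    # Conditional simulations: sample paths consistent with the data.
    sims = mean + rng.standard_normal((n_sims, len(grid))) @ L.T
    # Empirical distribution of the minimizer's grid index.
    p = np.bincount(sims.argmin(axis=1), minlength=len(grid)) / n_sims
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Toy usage: five evaluations of a smooth test function on [0, 1].
x_obs = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y_obs = np.sin(6.0 * x_obs)
grid = np.linspace(0.0, 1.0, 50)
H = minimizer_entropy(x_obs, y_obs, grid)
# H lies between 0 and log(50), the entropy range for 50 grid cells.
```

In a sequential loop of the kind the paper describes, one would then pick the next evaluation point so as to minimize the expected value of this entropy after the candidate evaluation (stepwise uncertainty reduction); the sketch above only shows the Monte Carlo estimation of the minimizer's distribution, which is the computational core of that criterion.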