A connectionist machine for genetic hillclimbing
Elements of information theory
Support vector machines, reproducing kernel Hilbert spaces, and randomized GACV
Advances in kernel methods
Response Surface Methodology: Process and Product Optimization Using Designed Experiments
Efficient Global Optimization of Expensive Black-Box Functions
Journal of Global Optimization
A Taxonomy of Global Optimization Methods Based on Response Surfaces
Journal of Global Optimization
Global Optimization of Stochastic Black-Box Systems via Sequential Kriging Meta-Models
Journal of Global Optimization
An informational approach to the global optimization of expensive-to-evaluate functions
Journal of Global Optimization
Properties of an adaptive archiving algorithm for storing nondominated vectors
IEEE Transactions on Evolutionary Computation
Toward autonomic grids: analyzing the job flow with affinity streaming
Proceedings of the 15th ACM SIGKDD international conference on Knowledge discovery and data mining
Variance or spectral density in sampled data filtering?
Journal of Global Optimization
In many global optimization problems motivated by engineering applications, the number of function evaluations is severely limited by time or cost. To ensure that each evaluation contributes usefully to locating good candidates for the global minimizer, a stochastic model of the function can be built and used to choose evaluation points sequentially. Building on Gaussian processes and Kriging, the authors recently introduced the informational approach to global optimization (IAGO), which provides a one-step-optimal choice of evaluation points in terms of the reduction of uncertainty about the location of the minimizers. To do so, the probability density of the minimizers is approximated using conditional simulations of the Gaussian process model underlying Kriging. This paper presents an empirical comparison between the underlying sampling criterion, called conditional minimizer entropy (CME), and the standard expected improvement (EI) sampling criterion. The comparison uses classical test functions, sample paths of the Gaussian model, and an industrial application; the results demonstrate the value of the CME criterion in terms of evaluation savings.
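The approximation of the minimizer density mentioned above can be illustrated with a minimal sketch: draw conditional simulations of a GP fitted to the observed data, record where each sample path attains its minimum on a candidate grid, and normalize the counts. This is not the authors' implementation — the function names (`sqexp`, `minimizer_density`), the zero-mean GP with a squared-exponential covariance, and all hyperparameter values are illustrative assumptions.

```python
import numpy as np

def sqexp(a, b, ell=0.3, sf2=1.0):
    # Squared-exponential covariance between 1-D point sets a and b
    # (assumed kernel choice, not specified by the paper).
    d = a[:, None] - b[None, :]
    return sf2 * np.exp(-0.5 * (d / ell) ** 2)

def minimizer_density(x_obs, y_obs, x_grid, n_sim=2000, noise=1e-8, seed=0):
    # Empirical density of the global minimizer over x_grid, estimated
    # from conditional simulations of a zero-mean GP fitted to (x_obs, y_obs).
    rng = np.random.default_rng(seed)
    K = sqexp(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = sqexp(x_obs, x_grid)                      # cross-covariance
    Kss = sqexp(x_grid, x_grid)
    mean = Ks.T @ np.linalg.solve(K, y_obs)        # kriging mean on the grid
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)      # posterior covariance
    L = np.linalg.cholesky(cov + 1e-6 * np.eye(len(x_grid)))  # jitter for stability
    z = rng.standard_normal((len(x_grid), n_sim))
    paths = mean[:, None] + L @ z                  # conditional sample paths
    counts = np.bincount(np.argmin(paths, axis=0), minlength=len(x_grid))
    return counts / n_sim
```

Each column of `paths` is one conditional simulation; the histogram of per-path argmins is the quantity whose entropy the CME criterion seeks to reduce.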
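The EI criterion used as the baseline in the comparison has a closed form under a Gaussian posterior. A minimal sketch for minimization, assuming posterior mean `mu`, posterior standard deviation `sigma`, and best observed value `f_min` (the function name is illustrative):

```python
import math

def _phi(z):
    # Standard normal pdf.
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def _Phi(z):
    # Standard normal cdf via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(mu, sigma, f_min):
    # EI(x) = (f_min - mu) * Phi(z) + sigma * phi(z), z = (f_min - mu) / sigma,
    # i.e. the expectation of max(f_min - F(x), 0) under the GP posterior.
    if sigma <= 0.0:
        return max(f_min - mu, 0.0)  # degenerate (noise-free, already observed) case
    z = (f_min - mu) / sigma
    return (f_min - mu) * _Phi(z) + sigma * _phi(z)
```

The next evaluation point is the maximizer of this quantity over the design space; the paper's experiments compare this rule against selecting the point that most reduces the entropy of the minimizer density.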