Surrogate-based optimization proceeds in cycles. Each cycle consists of analyzing a number of designs, fitting a surrogate, performing optimization based on the surrogate, and finally analyzing a candidate solution. Algorithms that use the surrogate's uncertainty estimator to guide the selection of the next sampling candidate are readily available, e.g., the efficient global optimization (EGO) algorithm. However, adding a single point at a time may not be efficient when the main concern is wall-clock time (rather than the number of simulations) and simulations can run in parallel. In addition, the need for uncertainty estimates limits EGO-like strategies to surrogates normally implemented with such estimates (e.g., kriging and polynomial response surfaces). We propose the multiple-surrogate efficient global optimization (MSEGO) algorithm, which adds several points per optimization cycle with the help of multiple surrogates. We import uncertainty estimates from one surrogate to another to allow the use of surrogates that do not provide them. The approach is tested on three analytic examples with nine basic surrogates, including kriging, radial basis neural networks, linear Shepard, and six different instances of support vector regression. We found that MSEGO works well even with imported uncertainty estimates, delivering better results in a fraction of the optimization cycles needed by EGO.
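As a minimal sketch of the uncertainty-guided selection described above, the snippet below computes the standard expected-improvement criterion used by EGO-like methods (for minimization) and picks the candidate that maximizes it. The function names and the `predict(x) -> (mu, sigma)` interface are illustrative assumptions, not the authors' implementation; in a MSEGO-style setup, `sigma` could just as well be imported from a different surrogate than the one supplying `mu`.

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected improvement of a candidate whose surrogate prediction is mu
    with uncertainty sigma, relative to the best observed value f_best
    (minimization convention)."""
    if sigma <= 0.0:
        # No predicted uncertainty: no expected gain over f_best.
        return 0.0
    z = (f_best - mu) / sigma
    # Standard normal pdf and cdf, via math.erf to stay dependency-free.
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (f_best - mu) * cdf + sigma * pdf

def next_sample(candidates, predict, f_best):
    """Pick the candidate with maximal expected improvement.
    `predict(x)` must return (mu, sigma); both interface and name are
    hypothetical, for illustration only."""
    return max(candidates, key=lambda x: expected_improvement(*predict(x), f_best))
```

For example, with a toy `predict` that returns a fixed uncertainty, `next_sample([0.0, 1.0, 2.0], lambda x: (x, 1.0), 0.5)` favors the candidate whose predicted value is lowest, since all candidates share the same `sigma`.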