Design of Computer Experiments for Metamodel Generation
Analog Integrated Circuits and Signal Processing
Optimization problems that arise in engineering design are often characterized by several features that hinder the use of standard nonlinear optimization techniques. Foremost among these is that the functions defining the problem are often computationally intensive: within a standard nonlinear optimization algorithm, the expense of evaluating these functions is incurred at every iteration. Faced with such prohibitive costs, an attractive alternative is to optimize surrogates instead, since surrogates can be chosen or constructed so that they are much less expensive to compute. For the purposes of this paper, we focus on the use of algebraic approximations as surrogates for the objective. We introduce so-called merit functions that explicitly recognize the desirability of improving the current approximation to the objective during the course of the optimization. We define and experiment with merit functions chosen to simultaneously improve both the solution to the optimization problem (the objective) and the quality of the approximation. Our goal is to further improve the effectiveness of our general approach without sacrificing any of its rigor.
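The idea of a merit function that balances minimizing the surrogate against improving its quality can be sketched concretely. The form below, a surrogate prediction penalized by the distance to the nearest previously evaluated point, is one plausible instantiation under illustrative assumptions: the piecewise-linear surrogate, the quadratic stand-in objective, and the weight `rho` are hypothetical choices for this sketch, not the paper's specific construction.

```python
import numpy as np

def merit(x, surrogate, samples, rho):
    # Merit value: the surrogate's predicted objective minus rho times the
    # distance to the nearest previously evaluated point. The second term
    # rewards candidates far from existing samples, where evaluating the
    # true objective would most improve the approximation; rho = 0 trusts
    # the surrogate and purely pursues its predicted minimum.
    d = min(abs(x - s) for s in samples)
    return surrogate(x) - rho * d

def true_objective(x):
    # Stand-in for an expensive simulation-based objective.
    return (x - 0.7) ** 2

# A few expensive evaluations already in hand.
samples = [0.0, 0.5, 1.0]
values = [true_objective(s) for s in samples]

# Cheap algebraic approximation: piecewise-linear interpolation of the data.
surrogate = lambda x: np.interp(x, samples, values)

# Pick the next expensive evaluation by minimizing the merit function
# over a cheap grid search on [0, 1].
grid = np.linspace(0.0, 1.0, 201)
best = min(grid, key=lambda x: merit(x, surrogate, samples, rho=0.2))
```

With this weighting the selected point sits between the surrogate's minimizer and the largest gap in the sample set, so a single evaluation makes progress on the objective while also refining the approximation where it is least tested.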