Real-world engineering design optimization problems often rely on computationally expensive simulations in place of laboratory experiments. A common optimization approach is to approximate the expensive simulation with a computationally cheaper model, yielding a model-assisted optimization algorithm. A prevalent issue in such problems is that the simulation may crash for some input vectors, a scenario which increases the optimization difficulty and wastes computer resources. A common way to handle such vectors is to assign them a penalized fitness and incorporate them into the model's training set, but this can severely deform the model and degrade optimization efficacy. As an alternative, we propose a classifier-assisted framework in which a classifier is incorporated into the optimization search and biases the optimizer away from vectors predicted to crash the simulator, with no model deformation. Performance analysis shows that the proposed framework improves performance relative to the penalty approach, and that it may be possible to 'knowledge-mine' the classifier as a post-optimization stage to gain new insights into the problem being solved.
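The idea can be sketched in a few lines: instead of evaluating every candidate and penalizing crashes, the search consults a classifier trained on previously evaluated vectors and skips those predicted to crash. The sketch below is an illustrative assumption, not the paper's implementation: it uses a toy "simulator" that crashes on a made-up region, a 1-nearest-neighbour crash classifier, and plain random search in place of the evolutionary optimizer.

```python
import random

# Toy "expensive simulation" (illustrative): crashes (returns None) when
# x[0] < 0, otherwise returns the sphere-function value to be minimized.
def simulate(x):
    if x[0] < 0:
        return None  # simulator crash
    return sum(v * v for v in x)

# Minimal 1-nearest-neighbour classifier over already-evaluated vectors,
# standing in for whatever classifier the framework would train.
def predict_crash(x, history):
    if not history:
        return False
    nearest = min(history,
                  key=lambda h: sum((a - b) ** 2 for a, b in zip(h[0], x)))
    return nearest[1]  # True if the nearest evaluated vector crashed

def classifier_assisted_search(dim=2, budget=200, seed=0):
    rng = random.Random(seed)
    history = []  # (vector, crashed?) pairs: the classifier's training set
    best_x, best_f = None, float("inf")
    for _ in range(budget):
        x = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
        # Bias the search away from vectors predicted to crash, rather
        # than evaluating them and assigning a penalized fitness.
        if predict_crash(x, history):
            continue
        f = simulate(x)
        history.append((x, f is None))
        if f is not None and f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

best_x, best_f = classifier_assisted_search()
print(best_x, best_f)
```

Note that crashed evaluations still enter the classifier's training set (as positive examples) but never enter a fitness model, which is how the framework avoids the model deformation caused by penalty values; the final `history` of crash labels is also the raw material one could 'knowledge-mine' after the run.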