Genetic algorithms (GAs) applied to complex optimization domains typically require a large number of fitness function evaluations to reach near-optimal solutions. In real-world domains such as engineering design, each evaluation can be computationally very expensive, so it is common to estimate or approximate the fitness instead. A popular approach is to construct a so-called surrogate, or meta-model, that approximates the original fitness function: it mimics the behavior of the original function but can be evaluated much faster. It is usually difficult to determine which approximate model to use and how often to use it, and the answer varies from problem to problem. To address this, an adaptive surrogate-assisted GA (ASAGA) is presented. ASAGA adaptively chooses the appropriate model type, and adjusts both the model complexity and the frequency of model usage according to the time spent and the model's accuracy. ASAGA also introduces a stochastic penalty function method to handle constraints. Experiments show that ASAGA outperforms non-adaptive surrogate-assisted GAs with statistical significance.
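The ideas in the abstract can be illustrated with a minimal sketch: a GA that archives expensive true evaluations, uses a cheap surrogate for the rest of the population, adapts how often the true function is called based on observed surrogate error, and applies a stochastic penalty for constraint violations. Everything here is illustrative, not the paper's actual algorithm: the nearest-neighbor surrogate, the error threshold, the penalty-coefficient range, and the toy objective and constraint are all assumptions made for the sketch.

```python
import random

random.seed(0)

def true_fitness(x):
    # Expensive black-box objective (toy stand-in): the sphere function.
    return sum(xi * xi for xi in x)

def constraint_violation(x):
    # Toy constraint: sum(x) >= 1; returns the violation amount (0 if feasible).
    return max(0.0, 1.0 - sum(x))

class NearestNeighborSurrogate:
    """Cheap stand-in surrogate: predicts the fitness of the nearest archived point."""
    def __init__(self):
        self.archive = []  # list of (point, true fitness) pairs

    def add(self, x, f):
        self.archive.append((x, f))

    def predict(self, x):
        nearest = min(self.archive,
                      key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))
        return nearest[1]

def penalized(f, x):
    # Stochastic penalty: the coefficient is drawn at random on each
    # evaluation (an illustrative choice, not the paper's exact scheme).
    return f + random.uniform(50.0, 150.0) * constraint_violation(x)

def evolve(dim=3, pop_size=20, generations=30):
    surrogate = NearestNeighborSurrogate()
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    true_eval_rate = 0.5  # fraction of individuals given true (expensive) evaluations
    for _ in range(generations):
        scored, errors = [], []
        for x in pop:
            if not surrogate.archive or random.random() < true_eval_rate:
                f = true_fitness(x)
                if surrogate.archive:
                    # Track surrogate error on points we evaluate exactly.
                    errors.append(abs(surrogate.predict(x) - f))
                surrogate.add(x, f)
            else:
                f = surrogate.predict(x)  # cheap approximate evaluation
            scored.append((penalized(f, x), x))
        # Adapt frequency of model usage: trust the surrogate more when its
        # recent prediction error is small, less when it is large.
        if errors:
            mean_err = sum(errors) / len(errors)
            true_eval_rate = 0.2 if mean_err < 1.0 else 0.8
        # Truncation selection plus Gaussian mutation.
        scored.sort(key=lambda t: t[0])
        parents = [x for _, x in scored[:pop_size // 2]]
        pop = parents + [[xi + random.gauss(0, 0.3)
                          for xi in random.choice(parents)]
                         for _ in range(pop_size - len(parents))]
    # Report the best penalized true fitness in the final population.
    return min(true_fitness(x) + 100.0 * constraint_violation(x) for x in pop)

best = evolve()
```

The key adaptive step is the update of `true_eval_rate`: when the surrogate's measured error rises, the sketch falls back toward exact evaluations, which is the same trade-off (accuracy versus evaluation cost) that ASAGA manages for model type, complexity, and usage frequency.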