Representations of quasi-Newton matrices and their use in limited memory methods
Mathematical Programming: Series A and B
A limited memory algorithm for bound constrained optimization
SIAM Journal on Scientific Computing
The nature of statistical learning theory
Algorithm 778: L-BFGS-B: Fortran subroutines for large-scale bound-constrained optimization
ACM Transactions on Mathematical Software (TOMS)
Fitness landscapes and memetic algorithm design
New ideas in optimization
An introduction to Support Vector Machines: and other kernel-based learning methods
Genetic Search with Approximate Function Evaluation
Proceedings of the 1st International Conference on Genetic Algorithms
Accelerating the Convergence of Evolutionary Algorithms by Fitness Landscape Approximation
PPSN V: Proceedings of the 5th International Conference on Parallel Problem Solving from Nature
Two Applications of Genetic Algorithms to Component Design
Selected Papers from AISB Workshop on Evolutionary Computing
SVMTorch: support vector machines for large-scale regression problems
The Journal of Machine Learning Research
Evolving Objects: A General Purpose Evolutionary Computation Library
Selected Papers from the 5th European Conference on Artificial Evolution
Optimizing thermal design of data center cabinets with a new multi-objective genetic algorithm
Distributed and Parallel Databases
An effective intelligent algorithm for stochastic optimization problem
CCDC'09: Proceedings of the 21st Chinese Control and Decision Conference
Causally-guided evolutionary optimization and its application to antenna array design
Integrated Computer-Aided Engineering
A new mutation operator based on a surrogate model of the fitness function is introduced. The original features of this approach are: (1) the model used to approximate the fitness, namely Support Vector Machines; (2) the adaptive granularity of the approximation, ranging from space-wide to closely localized around the best-so-far individual of the population; (3) the use of a deterministic optimization method on the surrogate model. The goal is to accelerate the convergence of the evolutionary algorithm, not to reduce the number of evaluations of the actual fitness by evaluating the surrogate model instead. First results on high-dimensional benchmark functions show the potential improvement this approach can bring in high-dimensional search spaces, and point out some limitations.
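The surrogate-guided mutation described above can be sketched roughly as follows. This is a hypothetical illustration only, not the authors' implementation: the function name `surrogate_mutation`, the RBF-kernel SVR hyperparameters, and the choice of L-BFGS-B as the deterministic optimizer on the surrogate are all assumptions.

```python
import numpy as np
from sklearn.svm import SVR            # SVM regression as the fitness surrogate
from scipy.optimize import minimize    # deterministic optimizer on the surrogate

def surrogate_mutation(population, fitnesses, bounds):
    """Hypothetical sketch: fit an SVM regression model to the evaluated
    individuals, then run a deterministic optimizer (here L-BFGS-B) on the
    surrogate, starting from the best-so-far individual. The returned point
    is a mutation candidate to be evaluated with the true fitness."""
    model = SVR(kernel="rbf", C=10.0).fit(population, fitnesses)
    best = population[np.argmin(fitnesses)]  # assuming minimization
    res = minimize(lambda x: model.predict(x.reshape(1, -1))[0],
                   x0=best, method="L-BFGS-B", bounds=bounds)
    return res.x

# Toy usage on the 5-dimensional sphere function
rng = np.random.default_rng(0)
dim = 5
pop = rng.uniform(-5.0, 5.0, size=(20, dim))
fit = np.sum(pop**2, axis=1)
child = surrogate_mutation(pop, fit, [(-5.0, 5.0)] * dim)
```

The adaptive granularity mentioned in the abstract would, under this sketch, correspond to shrinking the training sample (and the optimizer's bounds) from the whole search space down to a neighborhood of the best-so-far individual.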