Models for iterative global optimization
Evaluating evolutionary algorithms
Artificial Intelligence - Special volume on empirical methods
Neural Smithing: Supervised Learning in Feedforward Artificial Neural Networks
Building Blocks, Cohort Genetic Algorithms, and Hyperplane-Defined Functions
Evolutionary Computation
A comparison of predictive measures of problem difficulty in evolutionary algorithms
IEEE Transactions on Evolutionary Computation
Test-case generator for nonlinear continuous parameter optimization techniques
IEEE Transactions on Evolutionary Computation
This paper investigates neural network training as a potential source of benchmark problems for continuous, heuristic optimization algorithms. Using a student-teacher learning paradigm, the error surfaces of several neural networks are examined via fitness distance correlation, a measure previously applied to discrete, combinatorial optimization problems. The results suggest that neural network training tasks offer a number of desirable properties for algorithm benchmarking, including the ability to scale up to provide challenging problems in high-dimensional spaces.
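The fitness distance correlation measure discussed in the abstract is, in its standard formulation, simply the sample (Pearson) correlation between the fitness of candidate solutions and their distance to the nearest known global optimum. A minimal sketch of how it might be computed for a set of sampled points is below; the function name and the random-sampling usage are illustrative, not taken from the paper:

```python
import numpy as np

def fitness_distance_correlation(fitnesses, distances):
    """Pearson correlation between fitness values and distances
    to the nearest global optimum, over a sample of candidates."""
    f = np.asarray(fitnesses, dtype=float)
    d = np.asarray(distances, dtype=float)
    cf = f - f.mean()  # center fitness values
    cd = d - d.mean()  # center distances
    return float((cf * cd).sum() /
                 (np.sqrt((cf ** 2).sum()) * np.sqrt((cd ** 2).sum())))

# Illustrative usage on a toy minimization problem (sphere function),
# where the global optimum is the origin: fitness grows with distance,
# so the correlation should be strongly positive.
rng = np.random.default_rng(0)
samples = rng.uniform(-1.0, 1.0, size=(200, 5))
dists = np.linalg.norm(samples, axis=1)
fits = (samples ** 2).sum(axis=1)
fdc = fitness_distance_correlation(fits, dists)
```

For a minimization problem, an FDC near +1 indicates that fitness decreases as candidates approach the optimum (an "easy" landscape under this measure), while values near zero or negative suggest a misleading or difficult landscape.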