Finding optimal model parameters is usually a crucial task in engineering applications of classification and modelling. The cost of exhaustive search over a parameter grid of a given precision grows exponentially with the number of parameters, ruling it out in all but the simplest problems, so alternatives such as uniform design or the covariance matrix adaptation evolution strategy (CMA-ES) are usually applied. In this work we present two focused grid search (FGS) alternatives in which one repeatedly zooms into progressively more concentrated sets of discrete grid points in the parameter search space. The first, deterministic FGS (DFGS), is much faster than standard grid search, although still too costly in problems with a large number of parameters. The second, annealed FGS (AFGS), is a randomized version of DFGS in which only a fixed fraction of the grid points, selected at random, is examined at each step. As we show numerically on several classification problems for multilayer perceptrons and support vector machines, DFGS and AFGS are competitive with CMA-ES, one of the most successful evolutionary black-box optimizers. The choice of a concrete technique may thus rest on other considerations, and the simplicity and essentially parameter-free nature of both DFGS and AFGS may make them worthwhile alternatives to CMA-ES, with its thorough theoretical and experimental background.