Many practical engineering applications require accurate automatic decision systems, usually operating under tight computational constraints. Support Vector Machines (SVMs) with a Radial Basis Function (RBF) kernel are broadly accepted as the current state of the art for decision problems, but they require cross-validation to select the free parameters, which is computationally costly. In this work we investigate low-cost methods to select the spread parameter of the RBF kernel in SVMs. Our proposal relies on simple local methods that gather information about the local structure of each dataset. Empirical results on UCI datasets show that the proposed methods can be used as a fast alternative to the standard cross-validation procedure, with the additional advantage of avoiding the (often heuristic) task of fixing a priori the range of spread values to be explored.
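To make the idea concrete, the following is a minimal sketch of one common local-structure heuristic of this kind: setting the spread to the average distance from each training point to its k-th nearest neighbour. The function name `local_spread_estimate`, the choice of k, and the nearest-neighbour rule itself are illustrative assumptions here, not necessarily the exact estimators proposed in the paper.

```python
import numpy as np

def local_spread_estimate(X, k=1):
    """Heuristic RBF spread (sigma): mean distance of each point to its
    k-th nearest neighbour. A generic local-structure heuristic used for
    illustration, not necessarily the paper's exact method."""
    # Pairwise squared Euclidean distances via the expansion
    # ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    np.fill_diagonal(d2, np.inf)          # exclude self-distances
    d2 = np.maximum(d2, 0.0)              # guard against tiny negatives
    kth = np.sort(np.sqrt(d2), axis=1)[:, k - 1]
    return float(kth.mean())

# Example: estimate sigma on random data, then derive the gamma
# parameter for the kernel K(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
sigma = local_spread_estimate(X)
gamma = 1.0 / (2.0 * sigma ** 2)
```

The resulting sigma (or gamma) can be plugged directly into an RBF-kernel SVM, replacing the grid of candidate values that a cross-validation search would otherwise have to sweep.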