Optimizing resources in model selection for support vector machine
Pattern Recognition
The support vector machine (SVM) has become one of the most popular methods in machine learning in recent years. Parameter selection is an important step in building a high-performance SVM. Several existing methods select parameters by minimizing an estimate of the generalization error, such as the leave-one-out (LOO) bound or the empirical error. These methods must solve many quadratic programming problems and invert the Gram matrix, which makes them time-consuming on large-scale problems. This paper introduces a fast incremental method for optimizing the kernel parameters of an SVM by combining a geometric SVM algorithm with an approximation of the gradient of the empirical error. The method updates the kernel parameters and the working set online during incremental learning, which reduces both the CPU time and the storage space required. Numerical tests on several benchmark datasets confirm the effectiveness of our method.
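
To make the empirical-error criterion concrete, the following is a minimal sketch of gradient-based tuning of the RBF kernel width. It is not the paper's geometric incremental algorithm: it assumes scikit-learn's SVC as the base learner, uses a sigmoid-smoothed validation error in place of the empirical-error estimate, and approximates the gradient with a central finite difference in log(gamma); the data and parameter values are illustrative.

    # Minimal sketch (assumes scikit-learn is available): gradient descent on
    # log(gamma) against a smoothed error estimate, NOT the paper's geometric
    # incremental algorithm.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Synthetic two-class problem: two overlapping Gaussian blobs.
    n = 200
    X = np.vstack([rng.normal(-1.0, 1.0, (n, 2)), rng.normal(1.0, 1.0, (n, 2))])
    y = np.hstack([-np.ones(n), np.ones(n)])
    idx = rng.permutation(2 * n)
    X_tr, y_tr = X[idx[:300]], y[idx[:300]]
    X_val, y_val = X[idx[300:]], y[idx[300:]]

    def empirical_error(log_gamma, C=1.0):
        # Smoothed error estimate: sigmoid of the negative validation margins.
        clf = SVC(C=C, kernel="rbf", gamma=float(np.exp(log_gamma))).fit(X_tr, y_tr)
        margins = np.clip(y_val * clf.decision_function(X_val), -50.0, 50.0)
        return float(np.mean(1.0 / (1.0 + np.exp(margins))))  # sigmoid(-margin)

    # Gradient descent in log(gamma); the gradient of the error estimate is
    # approximated by a central finite difference with step h.
    theta, step, h = np.log(1.0), 2.0, 0.1
    for it in range(20):
        grad = (empirical_error(theta + h) - empirical_error(theta - h)) / (2.0 * h)
        theta -= step * grad
        print(f"iter {it:2d}  gamma={np.exp(theta):.4f}  "
              f"error estimate={empirical_error(theta):.4f}")

In the paper's incremental setting, a small change to the kernel parameter would reuse the previous solution and working set rather than retraining from scratch at every gradient step, as this sketch does.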