We propose a new method for optimizing the hyperparameters of a general Gaussian kernel for support vector machine classification. The hyperparameters are constrained to lie on a differentiable manifold. The proposed optimization technique is a gradient-like descent algorithm adapted to the geometric structure of the manifold of symmetric positive-definite matrices. We compare the performance of our approach with the classical support vector machine classifier and with other state-of-the-art methods on toy data and on real-world data sets.
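The core geometric idea, descending along a gradient while staying on the manifold of symmetric positive-definite (SPD) matrices, can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the objective, step size, and the affine-invariant exponential-map update are assumptions chosen to show how an SPD constraint is preserved by construction.

```python
import numpy as np
from scipy.linalg import expm, sqrtm

def spd_descent_step(X, grad, step):
    """One gradient-like descent step on the SPD manifold.

    Maps the (symmetrized) Euclidean gradient into the tangent space at X
    and moves along it via the exponential map of the affine-invariant
    metric:  X_next = X^{1/2} expm(-step * X^{-1/2} G X^{-1/2}) X^{1/2}.
    The update returns a symmetric positive-definite matrix by construction.
    """
    Xh = np.real(sqrtm(X))            # X^{1/2}
    Xh_inv = np.linalg.inv(Xh)        # X^{-1/2}
    G = 0.5 * (grad + grad.T)         # symmetrize the Euclidean gradient
    inner = Xh_inv @ G @ Xh_inv
    X_next = Xh @ expm(-step * inner) @ Xh
    return 0.5 * (X_next + X_next.T)  # clean up numerical asymmetry

# Toy objective standing in for a kernel-selection criterion:
# f(X) = ||X - A||_F^2 for a fixed SPD target A (hypothetical example).
A = np.diag([2.0, 0.5])
X = np.eye(2)
for _ in range(200):
    grad = 2.0 * (X - A)              # Euclidean gradient of f at X
    X = spd_descent_step(X, grad, step=0.05)
# X converges toward A while every iterate remains positive-definite.
```

In a kernel-optimization setting, `X` would play the role of the matrix parameterizing the general Gaussian kernel and `grad` the gradient of a model-selection criterion; the point of the manifold step is that no projection or eigenvalue clipping is ever needed to keep the kernel matrix valid.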