Gradient-based optimization of Gaussian kernel functions is considered. The gradient for adapting the scaling and rotation of the input space is computed so as to achieve invariance against linear transformations. This is done by using the exponential map as a parameterization of the kernel parameter manifold. By restricting the optimization to a constant-trace subspace, the kernel size can be controlled, which is useful, for example, to prevent overfitting when minimizing radius-margin generalization performance measures. The concepts are demonstrated by training hard-margin support vector machines on toy data.
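The two ideas above can be illustrated with a minimal sketch. It is not the authors' implementation; the helper names (`gaussian_kernel`, `traceless_step`) and the specific constraint used here are illustrative assumptions. A symmetric parameter matrix `A` is mapped through the matrix exponential to a symmetric positive definite metric `M = expm(A)`, so scaling and rotation of the input space are adapted jointly. Projecting the gradient onto the zero-trace subspace keeps `tr(A)` fixed; since `det(expm(A)) = exp(tr(A))`, this holds the determinant of `M` constant, one natural way to fix the kernel size during optimization.

```python
import numpy as np
from scipy.linalg import expm

def gaussian_kernel(x, y, A):
    """k(x, y) = exp(-(x - y)^T M (x - y)) with M = expm(A).

    For any symmetric A, expm(A) is symmetric positive definite,
    so the parameterization never leaves the manifold of valid metrics.
    """
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    M = expm(A)
    return float(np.exp(-d @ M @ d))

def traceless_step(A, grad, lr=0.1):
    """One gradient step restricted to the constant-trace subspace.

    The gradient is symmetrized (to stay on symmetric matrices) and its
    trace component removed, so tr(A) -- and hence det(expm(A)) -- is
    unchanged by the update. (Illustrative constraint, not the paper's.)
    """
    G = 0.5 * (grad + grad.T)              # keep A symmetric
    n = A.shape[0]
    G = G - (np.trace(G) / n) * np.eye(n)  # project out the trace part
    return A - lr * G

# Toy usage: one constrained step starting from M = I (A = 0).
A = np.zeros((2, 2))
grad = np.array([[1.0, 0.2],
                 [0.2, -0.5]])
A = traceless_step(A, grad)
print(np.trace(A))  # trace is preserved by the projected step
```

Controlling `det(M)` rather than individual entries is what makes the restriction useful against overfitting: the optimizer can reshape the metric (rotate, trade off directions) but cannot shrink the kernel globally to memorize the training set.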