We describe how to use Schoenberg's theorem for a radial kernel, combined with existing bounds on the approximation error functions for Gaussian kernels, to obtain a bound on the approximation error function for the radial kernel. The result is applied to the exponential kernel and Student's kernel. To establish these results we develop a general theory of mixtures of kernels. We analyze the reproducing kernel Hilbert space (RKHS) of the mixture in terms of the RKHSs of the mixture components and prove a Jensen-type inequality between the approximation error function for the mixture and the approximation error functions of the mixture components.
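The role of Schoenberg's theorem here can be sketched as follows (an illustrative statement in our own notation, with symbols $\varphi$, $\mu$, and $k_t$ chosen for this sketch rather than taken from the paper): a kernel of the form $k(x,x') = \varphi(\|x-x'\|^2)$ is positive definite on $\mathbb{R}^d$ for every $d$ if and only if $\varphi$ is completely monotone, which by Bernstein's theorem is equivalent to a Gaussian-mixture representation,

```latex
% Schoenberg / Bernstein: \varphi completely monotone means
%   \varphi(r) = \int_0^\infty e^{-t r}\, d\mu(t)
% for some finite non-negative Borel measure \mu, and hence
\[
  k(x, x')
  \;=\; \int_0^\infty e^{-t\,\|x - x'\|^2}\, d\mu(t)
  \;=\; \int_0^\infty k_t(x, x')\, d\mu(t),
\]
% i.e. the radial kernel k is a mixture of Gaussian kernels
% k_t(x,x') = e^{-t\|x-x'\|^2} with width parameter t > 0.
% For example, the exponential kernel e^{-\|x-x'\|} is of this form:
% with r = \|x-x'\|^2 it corresponds to \varphi(r) = e^{-\sqrt{r}},
% which is completely monotone.
```

Viewing the radial kernel as such a mixture is what lets the Jensen-type inequality transfer the known approximation error bounds for the Gaussian components $k_t$ to the radial kernel itself.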