Support vector (SV) machines are linear classifiers that use the maximum-margin hyperplane in a feature space defined by a kernel function. Previously, the only bounds on the generalization performance of SV machines (within Valiant's probably approximately correct framework) took no account of the kernel used except through its effect on the margin and radius. It has since been shown that the relevant covering numbers can be bounded using tools from functional analysis. In this paper, we show that the resulting bound can be greatly simplified. The new bound involves the eigenvalues of the integral operator induced by the kernel, and it shows that the effective dimension depends on the rate of decay of these eigenvalues. We conclude with an explicit calculation of the covering numbers for an SV machine with a Gaussian kernel; the resulting bound is significantly better than that implied by previous results.
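The role of eigenvalue decay can be illustrated numerically. The sketch below (an illustration, not the paper's bound) computes the spectrum of the Gram matrix of a Gaussian kernel on a random sample; the Gram eigenvalues approximate the eigenvalues of the induced integral operator, and their rapid decay is what keeps the effective dimension small. The sample size, bandwidth, and domain here are arbitrary choices for demonstration.

```python
import numpy as np

def gaussian_gram(X, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2))."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))    # 200 sample points on [-1, 1]
K = gaussian_gram(X, sigma=0.5)

# Eigenvalues of the (symmetric) Gram matrix, sorted largest first.
eigs = np.sort(np.linalg.eigvalsh(K))[::-1]

# Normalizing by the top eigenvalue shows the decay: only a handful of
# directions carry almost all of the spectrum.
ratios = eigs / eigs[0]
print(ratios[:8])
```

Rerunning with a larger bandwidth `sigma` makes the decay even faster (fewer effective dimensions), while a smaller bandwidth slows it, consistent with the abstract's point that the effective dimension is governed by the decay rate of the kernel's spectrum.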