The covering number in learning theory (Journal of Complexity)
Support Vector Machines
Classification with Gaussians and Convex Loss (The Journal of Machine Learning Research)
Capacity of reproducing kernel spaces in learning theory (IEEE Transactions on Information Theory)
An Explicit Description of the Reproducing Kernel Hilbert Spaces of Gaussian RBF Kernels (IEEE Transactions on Information Theory)
Metric entropy quantities, such as covering numbers and entropy numbers, and positive definite kernels play an important role in mathematical learning theory. Using smoothness properties of the Fourier transform of the kernels, Zhou [D.-X. Zhou, The covering number in learning theory, J. Complexity 18 (3) (2002) 739-767] proved an upper estimate for the covering numbers of the unit ball of Gaussian reproducing kernel Hilbert spaces (RKHSs), considered as a subset of the space of continuous functions. In this note we determine the exact asymptotic order of these covering numbers, exploiting an explicit description of Gaussian RKHSs via orthonormal bases. We show that Zhou's estimate is almost sharp, up to a double logarithmic factor, but that his conjecture on the correct asymptotic rate is far too optimistic. Moreover, we give an application of our entropy results to small deviations of certain smooth Gaussian processes.
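To make the quantities in the abstract concrete, the following LaTeX sketch recalls the standard definition of a covering number and the shape of the bounds being compared. The notation B_{H_sigma} for the RKHS unit ball, the constants C and c, and the precise double-logarithmic correction in the last display are assumptions recalled from the cited literature, not taken verbatim from this abstract.

```latex
% Covering number of a set A in a metric space (E, d): the minimal number
% of closed d-balls of radius \varepsilon needed to cover A.
\[
  \mathcal{N}(\varepsilon, A, d)
  = \min\Bigl\{ n \in \mathbb{N} :
      \exists\, x_1,\dots,x_n \in E \ \text{with}\
      A \subseteq \bigcup_{i=1}^{n} B(x_i,\varepsilon) \Bigr\}.
\]
% For the unit ball B_{H_\sigma} of a Gaussian RKHS over a bounded set
% X \subset \mathbb{R}^d, viewed as a subset of (C(X), \|\cdot\|_\infty),
% Zhou's upper estimate has the form (constants suppressed):
\[
  \ln \mathcal{N}\bigl(\varepsilon, B_{H_\sigma}, \|\cdot\|_\infty\bigr)
  \le C \,\bigl(\ln(1/\varepsilon)\bigr)^{d+1}.
\]
% The exact asymptotic order determined in the note differs from this
% bound only by a double logarithmic factor, consistent with the
% "almost sharp" statement in the abstract:
\[
  \ln \mathcal{N}\bigl(\varepsilon, B_{H_\sigma}, \|\cdot\|_\infty\bigr)
  \sim c \,\frac{\bigl(\ln(1/\varepsilon)\bigr)^{d+1}}
                {\bigl(\ln\ln(1/\varepsilon)\bigr)^{d}}
  \qquad (\varepsilon \to 0).
\]
```

The point of the comparison is that the gap between the upper bound and the exact rate is only the (ln ln)^d term in the denominator, whereas a conjectured rate with a strictly smaller power of ln(1/epsilon) would be off by a polynomial factor in the logarithm.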