In this paper, we prove a general leave-one-out style cross-validation bound for kernel methods. We apply this bound to several classification and regression problems and compare the results with previously known bounds. One notable aspect of our analysis is that the derived expected generalization bounds reflect both the approximation (bias) and learning (variance) properties of the underlying kernel methods. We are thus able to demonstrate the universality of certain learning formulations.
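The abstract does not spell out the bound itself, but the quantity such bounds control, the expected leave-one-out error of a kernel method, can be made concrete. Below is a minimal Python sketch, assuming kernel ridge regression with a Gaussian RBF kernel (not necessarily the formulation analyzed in the paper). It relies on the standard closed-form identity for leave-one-out residuals of ridge-type smoothers; the function names (rbf_kernel, loo_errors_kernel_ridge) and parameter values are purely illustrative.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of X and the rows of Y.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def loo_errors_kernel_ridge(X, y, lam=1.0, gamma=1.0):
    # Exact leave-one-out residuals for kernel ridge regression.
    # With hat matrix H = K (K + lam I)^{-1}, the leave-one-out residual at
    # point i is (y_i - yhat_i) / (1 - H_ii), so no refitting n times is needed.
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    H = K @ np.linalg.inv(K + lam * np.eye(n))
    yhat = H @ y
    return (y - yhat) / (1.0 - np.diag(H))

# Illustrative use: the mean squared leave-one-out residual serves as an
# estimate of the expected generalization error of the fitted kernel method.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
print(np.mean(loo_errors_kernel_ridge(X, y, lam=0.1, gamma=0.5) ** 2))

Because the identity avoids refitting the model n times, the leave-one-out estimate costs a single matrix inversion, which is one reason leave-one-out analyses are natural for kernel methods.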