In this article, we study leave-one-out-style cross-validation bounds for kernel methods. The essential element in our analysis is a bound on the parameter estimation stability of regularized kernel formulations. Using this result, we derive bounds on the expected leave-one-out cross-validation error, which lead to expected generalization bounds for various kernel algorithms. In addition, we obtain variance bounds for leave-one-out errors. We apply our analysis to several classification and regression problems and compare the resulting bounds with previous results.
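As background to the leave-one-out quantities the abstract refers to, the sketch below computes exact leave-one-out residuals for kernel ridge regression via the standard hat-matrix shortcut: for a linear smoother with fitted values H y, the LOO residual at point i equals the full-sample residual divided by (1 - H_ii). This is a minimal illustration of leave-one-out evaluation for a regularized kernel method, not the paper's stability bound itself; the function names (rbf_kernel, loo_errors_krr) and all parameter values are hypothetical.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix of the Gaussian kernel: K_ij = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def loo_errors_krr(K, y, lam):
    """Exact leave-one-out residuals for kernel ridge regression.

    The fitted values are H y with hat matrix H = K (K + lam I)^{-1},
    and the LOO residual at point i is (y_i - (H y)_i) / (1 - H_ii).
    """
    n = len(y)
    H = K @ np.linalg.inv(K + lam * np.eye(n))  # smoother ("hat") matrix
    resid = y - H @ y                           # full-sample residuals
    return resid / (1.0 - np.diag(H))           # exact LOO residuals

# Tiny usage example on synthetic data (hypothetical values).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
K = rbf_kernel(X, gamma=0.5)
loo = loo_errors_krr(K, y, lam=0.1)
print("estimated LOO mean squared error:", np.mean(loo**2))
```

The shortcut avoids refitting the model n times: one matrix inversion yields all n leave-one-out residuals, whose empirical average is the kind of quantity the expected leave-one-out and variance bounds above control.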