The least squares regression problem is considered for coefficient-based regularization schemes with an ℓ1 penalty. The learning algorithm is analyzed with samples drawn from unbounded sampling processes. The purpose of this paper is to present an elaborate concentration estimate for the algorithm by means of a novel stepping-stone technique. The learning rates derived from our analysis hold in a more general setting, and the refined analysis yields satisfactory learning rates even for non-smooth kernels.
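For context, a sketch of the coefficient-based ℓ1 regularization scheme the abstract appears to refer to (the normalization and notation here are assumptions, not taken from the paper): given a sample z = {(x_i, y_i)}_{i=1}^m and a kernel K, the estimator is built in the sample-dependent hypothesis space spanned by {K(·, x_j)}_{j=1}^m and is defined by

\[
f_{\mathbf{z},\lambda} = \sum_{j=1}^{m} \alpha_j^{\mathbf{z}} K(\cdot, x_j),
\qquad
\alpha^{\mathbf{z}} = \arg\min_{\alpha \in \mathbb{R}^m}
\frac{1}{m} \sum_{i=1}^{m} \Bigl( y_i - \sum_{j=1}^{m} \alpha_j K(x_i, x_j) \Bigr)^{2}
+ \lambda \sum_{j=1}^{m} |\alpha_j|.
\]

Unlike the classical scheme that penalizes the RKHS norm of f, the penalty here acts on the coefficient vector α, so the hypothesis space depends on the sample itself; this is what makes the analysis, particularly for non-smooth kernels and unbounded sampling, more delicate.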