Geometric Parameters of Kernel Machines
COLT '02 Proceedings of the 15th Annual Conference on Computational Learning Theory
Agnostic Learning Nonconvex Function Classes
COLT '02 Proceedings of the 15th Annual Conference on Computational Learning Theory
Localized Rademacher Complexities
COLT '02 Proceedings of the 15th Annual Conference on Computational Learning Theory
Entropy, Combinatorial Dimensions and Random Averages
COLT '02 Proceedings of the 15th Annual Conference on Computational Learning Theory
A few notes on statistical learning theory
Advanced lectures on machine learning
On the performance of kernel classes
The Journal of Machine Learning Research
On the rate of convergence of regularized boosting classifiers
The Journal of Machine Learning Research
On the Importance of Small Coordinate Projections
The Journal of Machine Learning Research
Support Vector Machine Soft Margin Classifiers: Error Analysis
The Journal of Machine Learning Research
Classification with non-i.i.d. sampling
Mathematical and Computer Modelling: An International Journal
On ranking and generalization bounds
The Journal of Machine Learning Research
We study the sample complexity of proper and improper learning problems with respect to different q-loss functions. We improve the known estimates for classes that have relatively small covering numbers in empirical L2 spaces (e.g., log-covering numbers that are polynomial with exponent p < 2). We present several examples of relevant classes with a "small" fat-shattering dimension, which hence fit our setup; the most important of these are kernel machines.
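The covering-number condition mentioned in the abstract can be stated formally. The following is a hedged sketch of the assumption, using standard empirical-process notation (the constant γ and the empirical measure μ_n are illustrative, not taken from the source):

```latex
% Polynomial log-covering-number assumption (exponent p < 2):
% N(\epsilon, F, L_2(\mu_n)) denotes the \epsilon-covering number of the
% class F with respect to the empirical L_2 metric induced by the sample.
% The assumed condition is that for some constant \gamma > 0 and some
% exponent 0 < p < 2,
\[
  \log N\bigl(\epsilon, F, L_2(\mu_n)\bigr) \;\le\; \gamma\, \epsilon^{-p},
  \qquad 0 < \epsilon \le 1 .
\]
```

Classes with a polynomially bounded fat-shattering dimension satisfy estimates of this form, which is why such classes (kernel machines in particular) fit the setup described in the abstract.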