Entropy, Combinatorial Dimensions and Random Averages
COLT '02 Proceedings of the 15th Annual Conference on Computational Learning Theory
We introduce a new parameter which may replace the fat-shattering dimension. Using this parameter we are able to provide improved complexity estimates for the agnostic learning problem with respect to any L_p norm. Moreover, we show that if fat_ε(F) = O(ε^{-p}) then F displays a clear phase transition which occurs at p = 2. The phase transition appears in the sample complexity estimates, the covering number estimates, and the growth rate of the Rademacher averages associated with the class. As part of our discussion, we prove the best known estimates on the covering numbers of a class when it is considered as a subset of L_p spaces. We also estimate the fat-shattering dimension of the convex hull of a given class. Both estimates are given in terms of the fat-shattering dimension of the original class.