We consider the sample complexity of agnostic learning with respect to squared loss. It is known that if the function class F used for learning is convex, then one can obtain better sample complexity bounds than in the general case. It has been claimed that a lower bound shows an essential gap in the rate between the convex and nonconvex cases. In this paper we show that the proof of that lower bound itself has a gap, although we do not provide a definitive answer to the validity of the claimed bound. More positively, we show that one can obtain "fast" sample complexity bounds for nonconvex F for "most" target conditional expectations. The new bounds depend on the detailed geometry of F, in particular on the distance, in a suitable sense, of the target's conditional expectation from the set of nonuniqueness points of the class F.
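The contrast in rates referred to above can be sketched informally as follows. This is a standard formulation from the fast-rates literature, not taken from this abstract; the complexity term comp(F), the estimator notation, and the exact conditions (bounded functions, squared loss) are illustrative assumptions.

```latex
% Squared-loss risk of f, and excess risk of an estimator \hat f over the class F:
R(f) = \mathbb{E}\,\bigl(f(X) - Y\bigr)^2, \qquad
\mathcal{E}(\hat f) = R(\hat f) - \inf_{f \in F} R(f).

% Typical agnostic ("slow") rate for a general class F with n samples:
\mathcal{E}(\hat f) = O\!\left(\sqrt{\frac{\mathrm{comp}(F)}{n}}\right),

% whereas for convex F under squared loss one can obtain the "fast" rate
\mathcal{E}(\hat f) = O\!\left(\frac{\mathrm{comp}(F)}{n}\right).
```

Here comp(F) stands for some capacity measure of the class (e.g., a localized Rademacher complexity); the paper's contribution is that the fast rate can persist for nonconvex F when the target's conditional expectation is sufficiently far from the nonuniqueness points of F.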