Separating formal bounds from practical performance in learning systems
Vapnik-Chervonenkis (VC) bounds play an important role in statistical learning theory, as they are the fundamental results explaining the generalization ability of learning machines. Over the years, considerable mathematical work has gone into improving the VC rates of convergence of empirical means to their expectations. The result obtained by Talagrand in 1994 appears to be more or less the final word on this issue as far as universal bounds are concerned. For fixed distributions, however, this bound can be outperformed in practice. We show that it is possible to replace the 2ε² factor in the exponent of the deviation term by the corresponding Cramér transform, as given by large deviations theorems. We then formulate rigorous distribution-sensitive VC bounds, and we explain why these theoretical results can lead to practical estimates of the effective VC dimension of learning structures.
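As an illustration of the gap the abstract alludes to (a sketch of the standard large-deviations comparison, not code from the paper itself): for a Bernoulli(p) empirical mean, the universal Hoeffding-type exponent 2ε² can be compared against the distribution-dependent Cramér transform, which for Bernoulli variables is the KL divergence KL(p+ε ‖ p). By Pinsker's inequality the Cramér exponent always dominates 2ε², so the distribution-sensitive bound decays at least as fast.

```python
import math

def hoeffding_exponent(eps):
    # Distribution-free exponent: P(p_hat - p >= eps) <= exp(-n * 2 * eps**2)
    return 2 * eps ** 2

def cramer_exponent(p, eps):
    # Cramér transform (large-deviations rate function) of a Bernoulli(p)
    # mean evaluated at p + eps; equals the KL divergence KL(p+eps || p).
    q = p + eps
    return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

# For any fixed p in (0, 1) and admissible eps, Pinsker's inequality
# guarantees KL(p+eps || p) >= 2 * eps**2, so the distribution-dependent
# exponent is never worse than the universal one.
p, eps = 0.1, 0.05
print(cramer_exponent(p, eps))      # distribution-dependent rate
print(hoeffding_exponent(eps))      # universal rate (smaller here)
```

For p = 0.1 and ε = 0.05, the Cramér exponent is roughly 0.0122 versus 0.005 for 2ε², i.e. the distribution-dependent deviation bound decays more than twice as fast in n.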