There exist many different generalization error bounds in statistical learning theory, each of which improves on the others for certain situations or algorithms. Our goal is, first, to underline the links between these bounds and, second, to combine their different improvements into a single bound. In particular, we combine the PAC-Bayes approach introduced by McAllester (1998), which is well suited to randomized predictions, with the optimal union bound provided by the generic chaining technique developed by Fernique and Talagrand (see Talagrand, 1996), in a way that also takes into account the variance of the combined functions. We also show how this connects to Rademacher-based bounds.
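To make the PAC-Bayes ingredient concrete, the sketch below evaluates one common refined form of McAllester's bound for a finite hypothesis set: with probability at least 1 - delta, the risk of the Gibbs (randomized) predictor drawn from a posterior rho is bounded by its empirical risk plus sqrt((KL(rho||pi) + ln(2*sqrt(n)/delta)) / (2n)). The function names and the specific constants are illustrative; exact constants vary across versions of the bound in the literature.

```python
import math

def kl_divergence(rho, pi):
    """KL(rho || pi) for two discrete distributions over the same finite set."""
    return sum(r * math.log(r / p) for r, p in zip(rho, pi) if r > 0)

def pac_bayes_bound(emp_risks, rho, pi, n, delta=0.05):
    """One common refined form of McAllester's PAC-Bayes bound (illustrative):
    E_rho[L(h)] <= E_rho[L_hat(h)] + sqrt((KL(rho||pi) + ln(2*sqrt(n)/delta)) / (2n)),
    where emp_risks[i] is the empirical risk of hypothesis i on n samples."""
    gibbs_emp_risk = sum(r * l for r, l in zip(rho, emp_risks))
    slack = math.sqrt(
        (kl_divergence(rho, pi) + math.log(2 * math.sqrt(n) / delta)) / (2 * n)
    )
    return gibbs_emp_risk + slack

# Example: uniform prior over two hypotheses, posterior equal to the prior,
# so the KL term vanishes and only the ln(2*sqrt(n)/delta) term remains.
bound = pac_bayes_bound(emp_risks=[0.1, 0.2], rho=[0.5, 0.5],
                        pi=[0.5, 0.5], n=1000)
```

Note that the complexity term depends on the posterior only through KL(rho||pi), which is what makes the bound attractive for randomized predictions: sharpening it with the union bound of generic chaining is precisely the combination pursued in the paper.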