We extend the PAC-Bayes theorem to the sample-compression setting, where each classifier is represented by two independent sources of information: a compression set, consisting of a small subset of the training data, and a message string that encodes the additional information needed to obtain a classifier. The new bound is obtained by using a prior over a data-independent set of objects, where each object yields a classifier only once the training data is provided. The new PAC-Bayes theorem states that a Gibbs classifier defined on a posterior over sample-compressed classifiers can have a smaller risk bound than any such (deterministic) sample-compressed classifier.
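For context, the standard PAC-Bayes theorem that the abstract refers to extending can be sketched as follows, in its usual Langford-Seeger form. The notation here is the conventional one and is an assumption on our part, as the abstract itself does not spell it out: S is a training sample of m examples drawn i.i.d. from a distribution D, P is a data-independent prior and Q a posterior over classifiers, R_S(G_Q) and R(G_Q) are the empirical and true risks of the Gibbs classifier G_Q, and kl denotes the binary Kullback-Leibler divergence.

% Sketch of the standard (non-compressed) PAC-Bayes bound for a
% Gibbs classifier G_Q; the paper's contribution is to extend a bound
% of this type to posteriors Q over sample-compressed classifiers.
% All symbols below are standard notation assumed for illustration,
% not taken from the abstract.
\[
  \Pr_{S \sim D^m}\!\left(
    \forall Q:\;
    \mathrm{kl}\!\left( R_S(G_Q) \,\middle\|\, R(G_Q) \right)
    \le \frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{m+1}{\delta}}{m}
  \right) \ge 1 - \delta,
\]
\[
  \text{where}\quad
  \mathrm{kl}(q \,\|\, p) = q \ln\frac{q}{p} + (1-q)\ln\frac{1-q}{1-p}.
\]

In the sample-compression extension described in the abstract, the prior P is placed over data-independent objects (compression-set indices paired with message strings) rather than over classifiers directly, so the same bound template can apply even though each classifier is only defined once the training data is given.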