We establish a generic theoretical tool for constructing probabilistic bounds for algorithms whose output is a subset of objects from an initial pool of candidates (or, more generally, a probability distribution on that pool). This general device, dubbed "Occam's hammer", acts as a meta layer when a probabilistic bound is already known for the objects of the pool taken individually, and aims at controlling the proportion of objects in the output set that do not satisfy their individual bound. In this regard, it can be seen as a non-trivial generalization of the "union bound with a prior" ("Occam's razor"), a familiar tool in learning theory. We give applications of this principle to randomized classifiers (providing an interesting alternative approach to PAC-Bayes bounds) and to multiple testing (where it allows us to exactly recover and extend the so-called Benjamini-Yekutieli testing procedure).
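For reference, the "union bound with a prior" that Occam's hammer generalizes can be stated as follows (the notation $\mathcal{H}$, $\pi$, $\delta$, and $\mathrm{fail}$ below is ours, chosen for illustration): if each candidate $h$ in a countable pool $\mathcal{H}$ satisfies an individual bound whose failure probability is at most $\delta\,\pi(h)$, where $\pi$ is a prior distribution on $\mathcal{H}$, then with probability at least $1-\delta$ all bounds hold simultaneously:
\[
  \Pr\bigl[\exists\, h \in \mathcal{H} : \mathrm{fail}(h)\bigr]
  \;\le\; \sum_{h \in \mathcal{H}} \Pr\bigl[\mathrm{fail}(h)\bigr]
  \;\le\; \delta \sum_{h \in \mathcal{H}} \pi(h)
  \;=\; \delta .
\]
Occam's hammer relaxes this all-or-nothing guarantee: rather than requiring every individual bound to hold at once, it controls the proportion of the output set that violates its bound, which is what accommodates subset-valued or distribution-valued outputs.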