Maximum concept classes of VC dimension $d$ over $n$ domain points have size $\binom{n}{\le d} = \sum_{i=0}^{d}\binom{n}{i}$, and by Sauer's lemma this is an upper bound on the size of any concept class of VC dimension $d$ over $n$ points. We give a compression scheme for any maximum class that represents each concept by a subset of at most $d$ unlabeled domain points and has the property that, for any sample of a concept in the class, the representative of exactly one of the concepts consistent with the sample is a subset of the domain of the sample. This allows us to compress any sample of a concept in the class to a subset of at most $d$ unlabeled sample points such that this subset represents a concept consistent with the entire original sample. Unlike the previously known compression scheme for maximum classes (Floyd and Warmuth, 1995), which compresses to labeled subsets of the sample of size equal to $d$, our new scheme is tight in the sense that the number of possible unlabeled compression sets of size at most $d$ equals the number of concepts in the class.
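The counting identity behind the tightness claim can be checked directly. The sketch below (an illustration, not the paper's scheme) uses the standard example of a maximum class, the class of all subsets of size at most $d$ over $n$ points, and verifies that its size equals $\binom{n}{\le d}$, which is exactly the number of unlabeled compression sets of size at most $d$:

```python
from itertools import combinations
from math import comb


def sauer_bound(n: int, d: int) -> int:
    """Sauer's bound: sum_{i=0}^{d} C(n, i), the maximum size of a
    concept class of VC dimension d over n domain points."""
    return sum(comb(n, i) for i in range(d + 1))


n, d = 6, 2

# A standard maximum class of VC dimension d: all subsets of the
# n-point domain of size at most d (each subset is one concept).
concepts = [frozenset(s)
            for i in range(d + 1)
            for s in combinations(range(n), i)]

# Unlabeled compression sets of size at most d are also subsets of
# at most d domain points, so the two collections have equal size.
compression_sets = [frozenset(s)
                    for i in range(d + 1)
                    for s in combinations(range(n), i)]

assert len(concepts) == sauer_bound(n, d) == len(compression_sets)
print(len(concepts))  # 1 + 6 + 15 = 22 for n=6, d=2
```

This one-to-one count is what makes an unlabeled scheme possible in principle; the paper's contribution is a concrete bijection between concepts and compression sets with the consistency property stated above.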