Unlabeled Compression Schemes for Maximum Classes

  • Authors:
  • Dima Kuzmin; Manfred K. Warmuth

  • Venue:
  • The Journal of Machine Learning Research
  • Year:
  • 2007

Abstract

Maximum concept classes of VC dimension d over n domain points have size (n choose ≤ d), i.e. the sum of the binomial coefficients (n choose i) for i = 0, …, d, and this is an upper bound on the size of any concept class of VC dimension d over n points. We give a compression scheme for any maximum class that represents each concept by a subset of up to d unlabeled domain points and has the property that, for any sample of a concept in the class, the representative of exactly one of the concepts consistent with the sample is a subset of the domain of the sample. This allows us to compress any sample of a concept in the class to a subset of up to d unlabeled sample points such that this subset represents a concept consistent with the entire original sample. Unlike the previously known compression scheme for maximum classes (Floyd and Warmuth, 1995), which compresses to labeled subsets of the sample of size equal to d, our new scheme is tight in the sense that the number of possible unlabeled compression sets of size at most d equals the number of concepts in the class.
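
The following is a minimal sketch, not the paper's general construction: it checks the counting identity behind the tightness claim and verifies the defining property of an unlabeled compression scheme on a hypothetical toy maximum class (VC dimension 1 over 3 points, consisting of the empty concept and the three singletons), where each concept is represented by the unlabeled set of points it labels positive.

```python
from itertools import chain, combinations
from math import comb

# Size bound from the abstract: a maximum class of VC dimension d over n
# points has exactly sum_{i=0}^{d} C(n, i) concepts, which also counts the
# unlabeled subsets of size at most d -- the sense in which the scheme is tight.
def binom_leq(n, d):
    return sum(comb(n, i) for i in range(d + 1))

# Hypothetical toy maximum class of VC dimension d = 1 over n = 3 points:
# the empty concept plus the three singletons (labels given as 0/1 tuples).
domain = (0, 1, 2)
concepts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
assert len(concepts) == binom_leq(len(domain), 1)

# Representative of a concept: the (unlabeled) set of points it labels 1.
# This representation happens to work for this toy class; it is not the
# general scheme constructed in the paper.
rep = {c: frozenset(i for i in domain if c[i]) for c in concepts}

def subsets(points):
    return chain.from_iterable(combinations(points, r) for r in range(len(points) + 1))

# Defining property: for every concept and every sample domain S, exactly one
# concept consistent with the sample has its representative contained in S.
for c in concepts:
    for S in subsets(domain):
        consistent = [h for h in concepts if all(h[i] == c[i] for i in S)]
        fitting = [h for h in consistent if rep[h] <= set(S)]
        assert len(fitting) == 1

print("unlabeled compression property verified on the toy class")
```

Because the uniqueness check above passes for every sample, any sample of a concept in this toy class can be compressed to the representative of the single consistent concept whose representative lies inside the sample domain, and decompressed back to a consistent concept, mirroring the scheme described in the abstract.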