Noise-tolerant distribution-free learning of general geometric concepts

  • Authors:
  • Nader H. Bshouty; Sally A. Goldman; H. David Mathias; Subhash Suri; Hisao Tamaki

  • Affiliations:
  • Technion–Israel Institute of Technology, Haifa, Israel; Washington Univ., St. Louis, MO; Ohio State Univ., Columbus; Washington Univ., St. Louis, MO; IBM Tokyo Research Lab, Yamato, Japan

  • Venue:
  • Journal of the ACM (JACM)
  • Year:
  • 1998

Abstract

We present an efficient algorithm for PAC-learning a very general class of geometric concepts over R^d for fixed d. More specifically, let T be any set of s halfspaces. Let x = (x_1, …, x_d) be an arbitrary point in R^d. With each t ∈ T we associate a Boolean indicator function I_t(x) which is 1 if and only if x is in the halfspace t. The concept class C_d^s that we study consists of all concepts formed by any Boolean function over I_{t_1}, …, I_{t_s} for t_i ∈ T. This class is much more general than any geometric concept class known to be PAC-learnable. Our results extend easily to efficient learning of any Boolean combination of a polynomial number of concepts selected from any concept class C over R^d, provided that the VC-dimension of C depends only on d and there is a polynomial-time algorithm to determine whether some concept from C is consistent with a given set of labeled examples. We also present a statistical query version of our algorithm that can tolerate random classification noise. Finally, we present a generalization of the standard ε-net result of Haussler and Welzl [1987] and apply it to give an alternative noise-tolerant algorithm for d = 2 based on geometric subdivisions.
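
To make the concept-class definition concrete, the following minimal Python sketch evaluates a concept of the kind described above: an arbitrary Boolean function applied to the indicator bits of s halfspaces in R^d. It is illustrative only and not taken from the paper; the names (halfspace_indicator, evaluate_concept), the halfspace representation as a weight vector and threshold, and the example Boolean combination are all hypothetical choices.

```python
# Illustrative sketch (not from the paper): evaluating a concept formed by a
# Boolean function over the indicator values of s halfspaces in R^d.
# All identifiers and the example data below are hypothetical.

from typing import Callable, Sequence, Tuple

# A halfspace t in R^d is represented here as a pair (w, b);
# its indicator I_t(x) is 1 iff w . x >= b.
Halfspace = Tuple[Sequence[float], float]


def halfspace_indicator(t: Halfspace, x: Sequence[float]) -> int:
    """Return 1 iff the point x lies in the halfspace t = (w, b)."""
    w, b = t
    return int(sum(wi * xi for wi, xi in zip(w, x)) >= b)


def evaluate_concept(
    halfspaces: Sequence[Halfspace],
    boolean_fn: Callable[[Sequence[int]], bool],
    x: Sequence[float],
) -> bool:
    """Label x by applying boolean_fn to the s bits I_{t_1}(x), ..., I_{t_s}(x)."""
    bits = [halfspace_indicator(t, x) for t in halfspaces]
    return boolean_fn(bits)


if __name__ == "__main__":
    # Example in R^2 with s = 3 halfspaces (d = 2).
    T = [
        ((1.0, 0.0), 0.0),   # x1 >= 0
        ((0.0, 1.0), 0.0),   # x2 >= 0
        ((1.0, 1.0), 1.0),   # x1 + x2 >= 1
    ]
    # An arbitrary Boolean combination of the three indicators:
    # inside the first two halfspaces but not the third.
    concept = lambda bits: bool(bits[0] and bits[1] and not bits[2])

    print(evaluate_concept(T, concept, (0.2, 0.3)))   # True
    print(evaluate_concept(T, concept, (0.8, 0.9)))   # False
```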