On the Proper Learning of Axis Parallel Concepts

  • Authors:
  • Nader H. Bshouty; Lynn Burroughs


  • Venue:
  • COLT '02 Proceedings of the 15th Annual Conference on Computational Learning Theory
  • Year:
  • 2002

Abstract

We study the proper learnability of axis-parallel concept classes in the PAC learning model and in the exact learning model with membership and equivalence queries. These classes include unions of boxes, DNF, decision trees and multivariate polynomials.

For the constant-dimensional axis-parallel concept classes C we show that the following problems have the same time complexity:

1. C is α-properly exactly learnable (with hypotheses of size at most α times the target size) from membership and equivalence queries.
2. C is α-properly PAC learnable (without membership queries) under any product distribution.
3. There is an α-approximation algorithm for the MINEQUIC problem (given g ∈ C, find a minimal-size f ∈ C that is equivalent to g).

In particular, C is α-properly learnable in polynomial time from membership and equivalence queries if and only if C is α-properly PAC learnable in polynomial time under the product distribution, if and only if MINEQUIC has a polynomial-time α-approximation algorithm. Using this result we give the first proper learning algorithm for decision trees over a constant-dimensional domain, and the first negative results on proper learning from membership and equivalence queries for many classes.

For the non-constant-dimensional axis-parallel concepts we show that, with the equivalence oracle, (1) ⇒ (3). We use this to show that (binary) decision trees are not properly learnable in polynomial time (assuming P ≠ NP) and that DNF is not s^α-properly learnable (assuming Σ₂ᵖ ≠ Pᴺᴾ).
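To make the notion of *proper* learning of axis-parallel concepts concrete, here is a minimal sketch (not the paper's algorithm) of the classic proper PAC learner for a single axis-parallel box: output the tightest box enclosing all positive examples. All names (`label`, `learn_box`, the sample-generation setup) are illustrative choices, not from the paper.

```python
import random

def label(point, box):
    # Membership of a point in an axis-parallel box, given as a
    # list of (lo, hi) intervals, one per dimension.
    return all(lo <= x <= hi for x, (lo, hi) in zip(point, box))

def learn_box(sample):
    # Proper learner: return the tightest axis-parallel box that
    # contains every positive example.  The hypothesis is itself a
    # box, which is what "proper" means here.
    positives = [p for p, y in sample if y]
    if not positives:
        return None  # no positive examples seen: empty concept
    dim = len(positives[0])
    return [(min(p[i] for p in positives), max(p[i] for p in positives))
            for i in range(dim)]

# Usage: points drawn uniformly from [0,1]^2, labeled by a hidden target box.
random.seed(0)
target = [(0.2, 0.7), (0.3, 0.9)]
points = [(random.random(), random.random()) for _ in range(500)]
sample = [(p, label(p, target)) for p in points]
h = learn_box(sample)
```

By construction the hypothesis box is contained in the target box, so it makes no false-positive errors and classifies the entire training sample correctly; the hardness results above show that this kind of proper consistency becomes intractable for richer axis-parallel classes such as unions of boxes or decision trees.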