The monotone theory for the PAC-model

  • Authors:
  • Nader H. Bshouty

  • Affiliations:
  • Department of Computer Science, Technion – Israel Institute of Technology, Room 736, Taub Building, IL-32000 Haifa, Israel

  • Venue:
  • Information and Computation
  • Year:
  • 2003

Abstract

In this paper we extend the Monotone Theory to the PAC-learning model with membership queries. Using this extension we show that a DNF formula that has at least one "1/poly-heavy" clause in one of its CNF representations (a clause that is not satisfied with probability at least 1/poly(n, s), where n is the number of variables and s is the number of terms in f) with respect to a distribution D is weakly learnable under this distribution. Consequently, DNF formulas that are not weakly learnable under the distribution D have no "1/poly-heavy" clauses in any of their CNF representations.

A DNF f is called τ-CDNF if there is τ' ≤ τ and a CNF representation of f that contains poly(n, s) clauses and τ'-approximates f according to the distribution D. We show that the class of all τ-CDNF is weakly (τ + ε)-PAC-learnable with membership queries under the distribution D.

We then show how to convert our algorithm into a parallel algorithm that runs in polylogarithmic time with a polynomial number of processors. In particular, decision trees are (strongly) PAC-learnable with membership queries under any distribution in parallel, in polylogarithmic time with a polynomial number of processors. Finally, we show that no efficient parallel exact learning algorithm exists for decision trees.
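The Monotone Theory that the abstract builds on rests on a classical identity: for an assignment a, define the shifted order x ≤_a y iff x ⊕ a ≤ y ⊕ a bitwise, and let M_a(f)(x) = ⋁_{y ≤_a x} f(y) be the a-monotone closure of f. Then f equals the conjunction of M_a(f) over all zeros a of f. The sketch below verifies this identity by brute force for one small hypothetical DNF (the function f here is an illustrative choice, not an example from the paper):

```python
from itertools import product

def le_a(x, y, a):
    # x <=_a y  iff  (x XOR a) <= (y XOR a) coordinatewise
    return all((xi ^ ai) <= (yi ^ ai) for xi, yi, ai in zip(x, y, a))

def monotone_closure(f, a, n):
    # M_a(f)(x) = OR over all y <=_a x of f(y)
    points = list(product((0, 1), repeat=n))
    return lambda x: any(f(y) for y in points if le_a(y, x, a))

n = 3
# hypothetical example DNF with s = 2 terms:  f = x0 x1  OR  not(x2)
f = lambda x: bool((x[0] and x[1]) or (not x[2]))

points = list(product((0, 1), repeat=n))
zeros = [a for a in points if not f(a)]          # assignments where f is 0
closures = [monotone_closure(f, a, n) for a in zeros]

# identity of the Monotone Theory:  f(x) = AND over zeros a of M_a(f)(x)
for x in points:
    assert all(M(x) for M in closures) == f(x)
```

Each M_a(f) is monotone in the shifted order and dominates f, while at a zero a of f the closure M_a(f)(a) is 0; this is why the conjunction over the zeros recovers f exactly, and it is the structural fact the paper's PAC extension exploits distributionally.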