On Using Extended Statistical Queries to Avoid Membership Queries

  • Authors:
  • Nader H. Bshouty; Vitaly Feldman

  • Venue:
  • COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
  • Year:
  • 2001

Abstract

The Kushilevitz-Mansour (KM) algorithm finds all the "heavy" Fourier coefficients of a Boolean function. It is the main tool for learning decision trees and DNF expressions in the PAC model with respect to the uniform distribution. The algorithm requires access to the membership query (MQ) oracle. We weaken this requirement by producing an analogue of the KM algorithm that uses extended statistical queries (SQs): SQs in which the expectation is taken with respect to a distribution given by the learning algorithm. We restrict the set of distributions that a learning algorithm may use for its SQs to a set of specific constant-bounded product distributions. Our analogue, which we call BS, finds all the "heavy" Fourier coefficients of degree lower than c log n. We use BS to learn decision trees, and by adapting Freund's boosting technique we give an algorithm that learns DNF in this model. Learning in this model implies learning with persistent classification noise and in some cases can be extended to learning with product attribute noise. We develop a characterization of learnability with these extended SQs and apply it to obtain several negative results about the model.
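The quantity at the heart of the abstract is the Fourier coefficient of a Boolean function f: {-1,1}^n → {-1,1}, defined as the expectation E[f(x)·χ_S(x)] where χ_S is the parity character on the index set S. A statistical query for this expectation can be answered by sampling. The sketch below (not from the paper; function names and sample counts are illustrative assumptions, and it estimates only under the uniform distribution rather than the extended product distributions the paper uses) shows the basic estimation step:

```python
import random

def chi(S, x):
    """Parity character chi_S(x) = product of x[i] for i in S, with x in {-1,1}^n."""
    p = 1
    for i in S:
        p *= x[i]
    return p

def estimate_coefficient(f, n, S, samples=20000, seed=0):
    """Estimate the Fourier coefficient hat{f}(S) = E_x[f(x) * chi_S(x)]
    under the uniform distribution on {-1,1}^n by simple Monte Carlo
    sampling (an SQ-style estimate; 'samples' controls the accuracy)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(samples):
        x = [rng.choice((-1, 1)) for _ in range(n)]
        total += f(x) * chi(S, x)
    return total / samples

# Example: for f(x) = x[0]*x[1], the coefficient on S = {0, 1} is exactly 1,
# and the coefficient on any other set is 0.
f = lambda x: x[0] * x[1]
heavy = estimate_coefficient(f, 2, (0, 1))
light = estimate_coefficient(f, 2, (0,))
```

Running the KM-style search over all sets S of degree below c log n, as the abstract describes, would call such an estimator once per candidate coefficient, recursing only on prefixes whose estimated Fourier weight is large.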