On noise-tolerant learning of sparse parities and related problems

  • Authors:
  • Elena Grigorescu; Lev Reyzin; Santosh Vempala

  • Affiliations:
  • School of Computer Science, Georgia Institute of Technology, Atlanta, GA (all three authors)

  • Venue:
  • ALT'11: Proceedings of the 22nd International Conference on Algorithmic Learning Theory
  • Year:
  • 2011

Abstract

We consider the problem of learning sparse parities in the presence of noise. For learning parities on r out of n variables, we give an algorithm that runs in time poly(log(1/δ), 1/(1−2η)) · n^((1 + (2η)^2 + o(1)) r/2) and uses only r·log(n/δ)·ω(1)/(1−2η)^2 samples in the random noise setting under the uniform distribution, where η is the noise rate and δ is the confidence parameter. By previously known results, this algorithm also works for adversarial noise and generalizes to arbitrary distributions. Even though efficient algorithms for learning sparse parities in the presence of noise would have major implications for learning other hypothesis classes, our work is the first to give a bound better than the brute-force O(n^r). As a consequence, we obtain the first nontrivial bound for learning r-juntas in the presence of noise, and also a small improvement in the complexity of learning DNF under the uniform distribution.
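To make the problem setting concrete, here is a minimal sketch of the brute-force O(n^r) baseline that the abstract compares against, not the paper's algorithm: it enumerates all size-r subsets of variables and returns the parity with the lowest empirical error on noisy labeled samples. The function and variable names (brute_force_sparse_parity, samples, target) are illustrative assumptions, not from the paper.

```python
from itertools import combinations
import random


def brute_force_sparse_parity(samples, n, r):
    """samples: list of (x, y), x a 0/1 tuple of length n, y in {0, 1}."""
    best_subset, best_errors = None, len(samples) + 1
    for subset in combinations(range(n), r):          # O(n^r) candidate parities
        errors = sum(
            (sum(x[i] for i in subset) % 2) != y      # parity of x on the subset
            for x, y in samples
        )
        if errors < best_errors:                       # keep the best empirical fit
            best_subset, best_errors = subset, errors
    return best_subset


if __name__ == "__main__":
    random.seed(0)
    n, r, eta, m = 12, 2, 0.1, 400                     # noise rate eta, m samples
    target = (3, 7)                                    # hidden parity support (assumed example)
    samples = []
    for _ in range(m):
        x = tuple(random.randint(0, 1) for _ in range(n))
        y = sum(x[i] for i in target) % 2
        if random.random() < eta:                      # flip the label with probability eta
            y ^= 1
        samples.append((x, y))
    print(brute_force_sparse_parity(samples, n, r))    # typically recovers (3, 7)
```

With η bounded away from 1/2 and enough samples, the true support minimizes empirical error with high probability, which is why exhaustive search works; the paper's contribution is beating this n^r enumeration cost.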