Lower Bounds for Agnostic Learning via Approximate Rank

  • Authors:
  • Adam R. Klivans; Alexander A. Sherstov

  • Affiliations:
  • The University of Texas at Austin, Department of Computer Sciences, 1 University Station C0500, Austin, TX 78712-0233, USA (both authors)

  • Venue:
  • Computational Complexity
  • Year:
  • 2010

Abstract

We prove that the concept class of disjunctions cannot be pointwise approximated by linear combinations of any small set of arbitrary real-valued functions. That is, suppose that there exist functions $$\phi_{1}, \ldots, \phi_{r}: \{-1,1\}^{n} \to \mathbb{R}$$ with the property that every disjunction $$f$$ on $$n$$ variables has $$\|f - \sum\nolimits_{i=1}^{r} \alpha_{i}\phi_{i}\|_{\infty} \leq 1/3$$ for some reals $$\alpha_{1}, \ldots, \alpha_{r}$$. We prove that then $$r \geq \exp\{\Omega(\sqrt{n})\}$$, which is tight. We prove an incomparable lower bound for the concept class of decision lists. For the concept class of majority functions, we obtain a lower bound of $$\Omega(2^{n}/n)$$, which almost meets the trivial upper bound of $$2^{n}$$ for any concept class. These lower bounds substantially strengthen and generalize the polynomial approximation lower bounds of Paturi (1992) and show that the regression-based agnostic learning algorithm of Kalai et al. (2005) is optimal.
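
The pointwise ($$L_{\infty}$$) approximation condition in the abstract can be checked directly for small $$n$$: finding the best coefficients $$\alpha_{1}, \ldots, \alpha_{r}$$ for a fixed basis is a Chebyshev approximation problem, which reduces to a linear program with one extra error variable. Below is a minimal sketch, assuming Python with NumPy and SciPy; the degree-2 parity basis and the $$-1$$ = TRUE convention are illustrative assumptions, not the paper's construction (the theorem concerns arbitrary real-valued bases $$\phi_{1}, \ldots, \phi_{r}$$).

```python
import itertools

import numpy as np
from scipy.optimize import linprog

# Illustrative setup (the choices here are hypothetical, not from the paper):
# enumerate the hypercube {-1,1}^n for a small n and build a matrix Phi whose
# columns are candidate functions phi_1, ..., phi_r evaluated pointwise.
n = 4
points = list(itertools.product([-1, 1], repeat=n))

def disjunction(subset, x):
    # OR of the variables x_i, i in subset, under the assumed convention
    # that -1 encodes TRUE: the disjunction is -1 iff some x_i is -1.
    return -1 if any(x[i] == -1 for i in subset) else 1

def parity(subset, x):
    # Product of the coordinates in subset (the empty product is 1).
    return float(np.prod([x[i] for i in subset])) if subset else 1.0

# Basis: all parities of degree <= 2 -- an arbitrary illustrative choice.
basis_sets = [s for k in range(3) for s in itertools.combinations(range(n), k)]
Phi = np.array([[parity(s, x) for s in basis_sets] for x in points])

def linf_error(f_vals):
    """Minimize eps subject to |f(x) - sum_i a_i phi_i(x)| <= eps for all x.

    Variables are (a_1, ..., a_r, eps); the two constraint blocks encode
    Phi a - eps <= f and -Phi a - eps <= -f, so this is a plain LP.
    """
    m, r = Phi.shape
    c = np.zeros(r + 1)
    c[-1] = 1.0  # objective: minimize eps
    A_ub = np.vstack([np.hstack([Phi, -np.ones((m, 1))]),
                      np.hstack([-Phi, -np.ones((m, 1))])])
    b_ub = np.concatenate([f_vals, -f_vals])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * r + [(0, None)])
    return res.fun

# Worst-case pointwise error over all (nonempty) disjunctions on n variables.
worst = max(
    linf_error(np.array([disjunction(s, x) for x in points], dtype=float))
    for k in range(1, n + 1)
    for s in itertools.combinations(range(n), k)
)
print(f"worst-case L_inf error over all disjunctions: {worst:.3f}")
```

For this toy basis the LP simply reports whatever error the degree-2 parities achieve; the paper's theorem says that driving this error below 1/3 for every disjunction forces $$r \geq \exp\{\Omega(\sqrt{n})\}$$, no matter how the $$\phi_{i}$$ are chosen.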