On the hardness of learning intersections of two halfspaces
Journal of Computer and System Sciences
Hardness of Reconstructing Multivariate Polynomials over Finite Fields
SIAM Journal on Computing
On noise-tolerant learning of sparse parities and related problems
ALT'11 Proceedings of the 22nd international conference on Algorithmic learning theory
Hardness results for agnostically learning low-degree polynomial threshold functions
Proceedings of the twenty-second annual ACM-SIAM symposium on Discrete Algorithms
On learning finite-state quantum sources
Quantum Information & Computation
A complete characterization of statistical query learning with applications to evolvability
Journal of Computer and System Sciences
Solving the learning parity with noise's open question
Information Processing Letters
Clustering in the boolean hypercube in a list decoding regime
ICALP'13 Proceedings of the 40th international conference on Automata, Languages, and Programming - Volume Part I
Candidate weak pseudorandom functions in AC0 ○ MOD2
Proceedings of the 5th conference on Innovations in theoretical computer science
Improved Approximation of Linear Threshold Functions
Computational Complexity
We study the learnability of several fundamental concept classes in the agnostic learning framework of [D. Haussler, Inform. and Comput., 100 (1992), pp. 78-150] and [M. Kearns, R. Schapire, and L. Sellie, Machine Learning, 17 (1994), pp. 115-141]. We show that under the uniform distribution, agnostically learning parities reduces to learning parities with random classification noise, commonly referred to as the noisy parity problem. Together with the parity learning algorithm of [A. Blum, A. Kalai, and H. Wasserman, J. ACM, 50 (2003), pp. 506-519], this gives the first nontrivial algorithm for agnostic learning of parities. We use similar techniques to reduce learning of two other fundamental concept classes under the uniform distribution to learning of noisy parities. Namely, we show that learning of disjunctive normal form (DNF) expressions reduces to learning noisy parities on just a logarithmic number of variables, and learning of $k$-juntas reduces to learning noisy parities on $k$ variables. We give essentially optimal hardness results for agnostic learning of monomials over $\{0,1\}^n$ and halfspaces over $\mathbb{Q}^n$. We show that for any constant $\epsilon > 0$, finding a monomial (halfspace) that agrees with an unknown function on a $1/2+\epsilon$ fraction of the examples is NP-hard even when there exists a monomial (halfspace) that agrees with the unknown function on a $1-\epsilon$ fraction of the examples. This resolves an open question due to Blum and significantly improves on a number of previous hardness results for these problems. We extend these results to $\epsilon=2^{-\log^{1-\lambda}n}$ ($\epsilon=2^{-\sqrt{\log n}}$ in the case of halfspaces) for any constant $\lambda>0$ under stronger complexity assumptions.
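To make the noisy parity setting concrete, here is a minimal sketch of the example oracle the abstract refers to: examples $x$ are drawn uniformly from $\{0,1\}^n$, labeled by the parity of an unknown subset $S$ of coordinates, and each label is flipped independently with noise rate $\eta$. The function name and parameters below are illustrative, not taken from the paper.

```python
import random

def sample_noisy_parity(n, S, eta, m, seed=0):
    """Draw m examples (x, y): x uniform in {0,1}^n,
    y = XOR of the bits of x indexed by S,
    flipped with probability eta (random classification noise)."""
    rng = random.Random(seed)
    examples = []
    for _ in range(m):
        x = [rng.randint(0, 1) for _ in range(n)]
        y = sum(x[i] for i in S) % 2          # parity on the hidden subset S
        if rng.random() < eta:                # independent label noise
            y ^= 1
        examples.append((x, y))
    return examples
```

The learner's task is to recover $S$ from such samples; the reductions in the paper show that an efficient algorithm for this problem (even for small $|S|$, as in the $k$-junta and DNF cases) would yield agnostic learners for the corresponding classes.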