Communications of the ACM.
Algebraic methods in the theory of lower bounds for Boolean circuit complexity. STOC '87.
A hard-core predicate for all one-way functions. STOC '89.
Learning decision trees using the Fourier spectrum. SIAM Journal on Computing.
Constant depth circuits, Fourier transform, and learnability. Journal of the ACM.
Decoding of Reed-Solomon codes beyond the error-correction bound. Journal of Complexity.
Probabilistic checking of proofs: a new characterization of NP. Journal of the ACM.
Proof verification and the hardness of approximation problems. Journal of the ACM.
SIAM Journal on Computing.
A threshold of ln n for approximating set cover. Journal of the ACM.
Learning Polynomials with Queries: The Highly Noisy Case. SIAM Journal on Discrete Mathematics.
Pseudorandom generators without the XOR lemma. Journal of Computer and System Sciences (special issue on the fourteenth annual IEEE Conference on Computational Complexity).
Some optimal inapproximability results. Journal of the ACM.
Noise-tolerant learning, the parity problem, and the statistical query model. Journal of the ACM.
List Decoding of Error-Correcting Codes: Winning Thesis of the 2002 ACM Doctoral Dissertation Competition. Lecture Notes in Computer Science.
Correcting Errors Beyond the Guruswami-Sudan Radius in Polynomial Time. FOCS '05.
Extractors from Reed-Muller codes. Journal of Computer and System Sciences (special issue on FOCS 2001).
On the List and Bounded Distance Decodability of Reed-Solomon Codes. SIAM Journal on Computing.
Optimal Inapproximability Results for MAX-CUT and Other 2-Variable CSPs? SIAM Journal on Computing.
Pseudorandom Bits for Polynomials. FOCS '07.
List-decoding Reed-Muller codes over small fields. STOC '08.
On hardness of learning intersection of two halfspaces. STOC '08.
Unconditional pseudorandom generators for low degree polynomials. STOC '08.
On agnostic boosting and parity learning. STOC '08.
Hardness of Minimizing and Learning DNF Expressions. FOCS '08.
On Agnostic Learning of Parities, Monomials, and Halfspaces. SIAM Journal on Computing.
Agnostic Learning of Monomials by Halfspaces Is Hard. FOCS '09.
Improved decoding of Reed-Solomon and algebraic-geometry codes. IEEE Transactions on Information Theory.
Maximum-likelihood decoding of Reed-Solomon codes is NP-hard. IEEE Transactions on Information Theory.
Explicit Codes Achieving List Decoding Capacity: Error-Correction With Optimal Redundancy. IEEE Transactions on Information Theory.
We study the polynomial reconstruction problem for low-degree multivariate polynomials over the finite field $\mathbb{F}_2$. In this problem, we are given a set of points $\mathbf{x}\in\{0,1\}^n$ and a target value $f(\mathbf{x})\in\{0,1\}$ for each of these points, with the promise that there is a polynomial over $\mathbb{F}_2$ of degree at most $d$ that agrees with $f$ on a $1-\varepsilon$ fraction of the points. Our goal is to find a degree-$d$ polynomial that has good agreement with $f$. We show that it is NP-hard to find a polynomial that agrees with $f$ on more than a $1-2^{-d}+\delta$ fraction of the points, for any $\varepsilon,\delta>0$. This holds even under the stronger promise that the polynomial fitting the data is in fact linear, while the algorithm is allowed to output a polynomial of degree $d$. Previously, hardness of approximation (or even NP-completeness) was known only for the case $d=1$, which follows from a celebrated result of Håstad [J. ACM, 48 (2001), pp. 798-859]. In the setting of computational learning, our result shows the hardness of nonproper agnostic learning of parities, where the learner is allowed to output a low-degree polynomial over $\mathbb{F}_2$ as its hypothesis. This is the first nonproper hardness result for this central problem in computational learning. Our results extend to multivariate polynomial reconstruction over any finite field.
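To make the reconstruction problem concrete (this illustrates the problem itself, not the hardness reduction), here is a minimal brute-force sketch over $\mathbb{F}_2$: it enumerates every polynomial of degree at most $d$ in $n$ variables and returns the one with maximum agreement with the given labels. The search is exponential in the number of monomials, so it is feasible only for tiny $n$ and $d$; all function names are illustrative.

```python
from itertools import combinations, product

def monomials(n, d):
    """All multilinear monomials of degree <= d in n variables over F_2,
    each represented by the tuple of variable indices it multiplies.
    (Multilinear monomials suffice, since x^2 = x over F_2.)"""
    return [m for k in range(d + 1) for m in combinations(range(n), k)]

def evaluate(coeffs, mons, x):
    """Evaluate sum over monomials m with coeffs[m] = 1 of prod_{i in m} x_i,
    with addition mod 2. The empty monomial () is the constant term."""
    val = 0
    for c, m in zip(coeffs, mons):
        if c and all(x[i] for i in m):
            val ^= 1
    return val

def best_fit(points, labels, n, d):
    """Exhaustively find a degree-<=d polynomial maximizing agreement
    with the target values; returns (coefficient vector, agreement fraction)."""
    mons = monomials(n, d)
    best_coeffs, best_agree = None, -1
    for coeffs in product((0, 1), repeat=len(mons)):
        agree = sum(evaluate(coeffs, mons, x) == y
                    for x, y in zip(points, labels))
        if agree > best_agree:
            best_coeffs, best_agree = coeffs, agree
    return best_coeffs, best_agree / len(points)
```

For example, labeling all of $\{0,1\}^3$ by the linear polynomial $x_0 + x_1$ and then flipping a single label leaves a unique degree-1 polynomial at agreement $7/8$ (any other affine function disagrees on at least 3 points), and the exhaustive search recovers it.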