Hardness of Reconstructing Multivariate Polynomials over Finite Fields

  • Authors:
  • Parikshit Gopalan; Subhash Khot; Rishi Saket

  • Affiliations:
  • parik@microsoft.com; khot@cs.nyu.edu; saket@cc.gatech.edu

  • Venue:
  • SIAM Journal on Computing
  • Year:
  • 2010

Abstract

We study the polynomial reconstruction problem for low-degree multivariate polynomials over the finite field $\mathbb{F}[2]$. In this problem, we are given a set of points $\mathbf{x}\in\{0,1\}^n$ and target values $f(\mathbf{x})\in\{0,1\}$ for each of these points, with the promise that there is a polynomial over $\mathbb{F}[2]$ of degree at most $d$ that agrees with $f$ on a $1-\varepsilon$ fraction of the points. Our goal is to find a degree-$d$ polynomial that has good agreement with $f$. We show that it is NP-hard to find a polynomial that agrees with $f$ on more than a $1-2^{-d}+\delta$ fraction of the points for any $\varepsilon,\delta>0$. This holds even under the stronger promise that the polynomial that fits the data is in fact linear, while the algorithm is allowed to find a polynomial of degree $d$. Previously, the only known hardness of approximation (or even NP-completeness) was for the case $d=1$, which follows from a celebrated result of Håstad [J. ACM, 48 (2001), pp. 798–859]. In the setting of computational learning, our result shows the hardness of nonproper agnostic learning of parities, where the learner is allowed a low-degree polynomial over $\mathbb{F}[2]$ as a hypothesis. This is the first nonproper hardness result for this central problem in computational learning. Our results can be extended to multivariate polynomial reconstruction over any finite field.
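To make the problem statement concrete, here is a minimal sketch (not from the paper) of how an instance looks computationally: a polynomial over $\mathbb{F}[2]$ is a set of monomials, evaluation is AND within a monomial and XOR across monomials, and the quantity of interest is the fraction of sample points on which a candidate polynomial agrees with the target values. The representation and helper names below are illustrative assumptions.

```python
from fractions import Fraction

def eval_poly(monomials, x):
    # Evaluate a polynomial over F_2 at a point x in {0,1}^n.
    # Each monomial is a tuple of variable indices; () denotes the
    # constant 1. Since x_i^2 = x_i over F_2, a monomial's value is
    # the AND of its variables, and the polynomial is their XOR.
    total = 0
    for mono in monomials:
        term = 1
        for i in mono:
            term &= x[i]
        total ^= term
    return total

def agreement(monomials, points, values):
    # Fraction of sample points where the polynomial matches f.
    hits = sum(eval_poly(monomials, x) == v
               for x, v in zip(points, values))
    return Fraction(hits, len(points))

# A tiny instance: the linear polynomial x0 + x1 fits the parity data
# exactly, so its agreement is 1; corrupting one target value drops
# the agreement to 3/4 (the epsilon of the promise).
points = [(0, 0), (0, 1), (1, 0), (1, 1)]
parity = [0, 1, 1, 0]
linear = [(0,), (1,)]
print(agreement(linear, points, parity))          # agreement 1
print(agreement(linear, points, [0, 1, 1, 1]))    # agreement 3/4
```

The reconstruction problem asks for a degree-$d$ polynomial maximizing this agreement fraction; the paper's result says that even when a linear polynomial achieves $1-\varepsilon$, beating $1-2^{-d}+\delta$ with a degree-$d$ hypothesis is NP-hard.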