Hardness of Reconstructing Multivariate Polynomials over Finite Fields

  • Authors:
  • Parikshit Gopalan, Subhash Khot, Rishi Saket

  • Venue:
  • FOCS '07 Proceedings of the 48th Annual IEEE Symposium on Foundations of Computer Science
  • Year:
  • 2007


Abstract

We study the polynomial reconstruction problem for low-degree multivariate polynomials over $\mathbb{F}_2$. In this problem, we are given a set of points $x \in \{0,1\}^n$ and target values $f(x) \in \{0,1\}$ for each of these points, with the promise that there is a polynomial over $\mathbb{F}_2$ of degree at most $d$ that agrees with $f$ at a $1 - \varepsilon$ fraction of the points. Our goal is to find a degree-$d$ polynomial that has good agreement with $f$. We show that it is NP-hard to find a polynomial that agrees with $f$ on more than a $1 - 2^{-d} + \delta$ fraction of the points, for any $\varepsilon, \delta > 0$. This holds even under the stronger promise that the polynomial fitting the data is in fact linear, while the algorithm is allowed to output a polynomial of degree $d$. Previously, the only known hardness of approximation (or even NP-completeness) was for the case $d = 1$, which follows from a celebrated result of Håstad [16]. In the setting of computational learning, our result shows the hardness of non-proper agnostic learning of parities, where the learner is allowed a low-degree polynomial over $\mathbb{F}_2$ as a hypothesis. This is the first non-proper hardness result for this central problem in computational learning. Our results extend to multivariate polynomial reconstruction over any finite field.
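
To make the reconstruction problem concrete, here is a minimal Python sketch (not from the paper; all function names and the toy instance are illustrative). It represents an $\mathbb{F}_2$ polynomial as a set of monomials, measures its agreement with the given labels, and brute-forces the best parity on a small instance — the $d = 1$ special case whose approximation Håstad's result already shows to be NP-hard.

```python
from itertools import combinations, product

def eval_poly(monomials, x):
    """Evaluate a polynomial over F_2, given as a set of monomials.

    Each monomial is a tuple of variable indices (the empty tuple is the
    constant 1). Evaluation is the XOR of the ANDs of the chosen coordinates.
    """
    return sum(all(x[i] for i in mono) for mono in monomials) % 2

def agreement(monomials, points, labels):
    """Fraction of sample points on which the polynomial matches f."""
    hits = sum(eval_poly(monomials, x) == y for x, y in zip(points, labels))
    return hits / len(points)

def best_parity(points, labels, n):
    """Brute-force the best-agreeing parity (degree 1, no constant term).

    Exponential in n; only meant to illustrate the d = 1 search space.
    """
    best, best_agree = None, -1.0
    for size in range(n + 1):
        for subset in combinations(range(n), size):
            mons = {(i,) for i in subset}
            a = agreement(mons, points, labels)
            if a > best_agree:
                best, best_agree = mons, a
    return best, best_agree

# Toy instance: f is the parity x0 XOR x2 on n = 3 variables, with one
# corrupted label (epsilon = 1/8 of the 8 points).
n = 3
points = [tuple(p) for p in product([0, 1], repeat=n)]
labels = [x[0] ^ x[2] for x in points]
labels[0] ^= 1  # corrupt one point
poly, agree = best_parity(points, labels, n)
print(poly, agree)  # recovers {(0,), (2,)} with 7/8 agreement
```

The hardness result says that once $d \ge 2$, no efficient algorithm can beat agreement $1 - 2^{-d} + \delta$ even when a parity fits a $1 - \varepsilon$ fraction of the data, so the extra expressive power of degree-$d$ hypotheses (as in a hypothetical degree-$d$ analogue of the search above) does not help.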