Hardness of reconstructing multivariate polynomials over finite fields

Parikshit Gopalan, Subhash Khot, Rishi Saket

Research output: Contribution to journal › Article › peer-review

Abstract

We study the polynomial reconstruction problem for low-degree multivariate polynomials over the finite field F_2. In this problem, we are given a set of points x ∈ {0,1}^n and target values f(x) ∈ {0,1} for each of these points, with the promise that there is a polynomial over F_2 of degree at most d that agrees with f on a 1 - ε fraction of the points. Our goal is to find a degree-d polynomial that has good agreement with f. We show that it is NP-hard to find a polynomial that agrees with f on more than a 1 - 2^{-d} + δ fraction of the points for any ε, δ > 0. This holds even with the stronger promise that the polynomial that fits the data is in fact linear, whereas the algorithm is allowed to find a polynomial of degree d. Previously, the only known hardness of approximation (or even NP-completeness) was for the case d = 1, which follows from a celebrated result of Håstad [J. ACM, 48 (2001), pp. 798-859]. In the setting of computational learning, our result shows the hardness of nonproper agnostic learning of parities, where the learner is allowed a low-degree polynomial over F_2 as a hypothesis. This is the first nonproper hardness result for this central problem in computational learning. Our results can be extended to multivariate polynomial reconstruction over any finite field.
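As an illustrative sketch (not part of the paper), the reconstruction setup over F_2 can be made concrete: enumerate the points of {0,1}^n, label them with a hidden parity (a degree-1 polynomial), corrupt the labels on a small ε fraction, and measure a candidate polynomial's agreement with the noisy target f. The hidden parity, the corrupted point set, and all parameter values below are hypothetical choices for illustration only.

```python
from itertools import product

n = 4
hidden = [0, 2]  # hypothetical hidden parity: x0 XOR x2

def parity(x, support):
    """Evaluate the degree-1 polynomial sum of x_i over F_2."""
    return sum(x[i] for i in support) % 2

# All 2^n points of {0,1}^n.
points = [tuple(p) for p in product((0, 1), repeat=n)]

# Corrupt the labels on 2 of the 16 points (an eps = 1/8 fraction),
# so the hidden parity agrees with f on a 1 - eps fraction.
corrupted = {points[3], points[10]}
labels = {x: parity(x, hidden) ^ (x in corrupted) for x in points}

def agreement(poly_eval):
    """Fraction of points where the candidate polynomial matches f."""
    return sum(poly_eval(x) == labels[x] for x in points) / len(points)

print(agreement(lambda x: parity(x, hidden)))  # 14/16 = 0.875
```

The reconstruction task is to recover a degree-d polynomial with agreement close to this 1 - ε benchmark; the paper's result says that beating agreement 1 - 2^{-d} + δ is already NP-hard, even when the planted polynomial is linear as above.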

Original language: English (US)
Pages (from-to): 2598-2621
Number of pages: 24
Journal: SIAM Journal on Computing
Volume: 39
Issue number: 6
State: Published - 2010

Keywords

  • Coding theory
  • Computational learning
  • Finite fields
  • Hardness of approximation
  • Polynomials

ASJC Scopus subject areas

  • Computer Science (all)
  • Mathematics (all)
