TY - GEN
T1 - Hardness of reconstructing multivariate polynomials over finite fields
AU - Gopalan, Parikshit
AU - Khot, Subhash
AU - Saket, Rishi
PY - 2007
Y1 - 2007
N2 - We study the polynomial reconstruction problem for low-degree multivariate polynomials over 𝔽₂. In this problem, we are given a set of points x ∈ {0,1}^n and target values f(x) ∈ {0,1} for each of these points, with the promise that there is a polynomial over 𝔽₂ of degree at most d that agrees with f on a 1 - ε fraction of the points. Our goal is to find a degree-d polynomial that has good agreement with f. We show that it is NP-hard to find a polynomial that agrees with f on more than a 1 - 2^{-d} + δ fraction of the points, for any ε, δ > 0. This holds even under the stronger promise that the polynomial fitting the data is in fact linear, while the algorithm is allowed to find a polynomial of degree d. Previously, the only known hardness of approximation (or even NP-completeness) was for the case d = 1, which follows from a celebrated result of Håstad [16]. In the setting of computational learning, our result shows the hardness of (non-proper) agnostic learning of parities, where the learner is allowed a low-degree polynomial over 𝔽₂ as a hypothesis. This is the first non-proper hardness result for this central problem in computational learning. Our results extend to multivariate polynomial reconstruction over any finite field.
UR - http://www.scopus.com/inward/record.url?scp=46749152630&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=46749152630&partnerID=8YFLogxK
DO - 10.1109/FOCS.2007.4389506
M3 - Conference contribution
AN - SCOPUS:46749152630
SN - 0769530109
SN - 9780769530109
T3 - Proceedings - Annual IEEE Symposium on Foundations of Computer Science, FOCS
SP - 349
EP - 359
BT - Proceedings of the 48th Annual IEEE Symposium on Foundations of Computer Science, FOCS 2007
T2 - 48th Annual IEEE Symposium on Foundations of Computer Science, FOCS 2007
Y2 - 20 October 2007 through 23 October 2007
ER -
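
For illustration, here is a minimal brute-force sketch of the reconstruction problem described in the abstract: given points in {0,1}^n with labels in {0,1}, find a degree-at-most-d polynomial over 𝔽₂ maximizing agreement with the labels. The function names (monomials, evaluate, best_fit) and the tiny example data are our own illustrative choices, not from the paper; exhaustive search is only feasible for very small n and d, and the paper's point is precisely that the general problem is NP-hard to approximate beyond 1 - 2^{-d} + δ.

from itertools import combinations, product

def monomials(n, d):
    """All multilinear monomials in n variables of degree <= d over F_2,
    each represented as a tuple of variable indices (() is the constant 1)."""
    return [m for k in range(d + 1) for m in combinations(range(n), k)]

def evaluate(coeffs, mons, x):
    """Evaluate sum_m coeffs[m] * prod_{i in m} x_i over F_2 at the point x."""
    return sum(c * all(x[i] for i in m) for c, m in zip(coeffs, mons)) % 2

def best_fit(points, labels, n, d):
    """Exhaustively search all degree-<=d polynomials over F_2 and return
    the coefficient vector with maximum agreement, plus that agreement rate."""
    mons = monomials(n, d)
    best, best_agree = None, -1
    for coeffs in product((0, 1), repeat=len(mons)):
        agree = sum(evaluate(coeffs, mons, x) == y
                    for x, y in zip(points, labels))
        if agree > best_agree:
            best, best_agree = coeffs, agree
    return best, best_agree / len(points)

# Usage example: labels from x1 XOR x2 with one corrupted point, so no
# degree-1 polynomial fits perfectly and the best achieves 3/4 agreement.
pts = [(0, 0, 0), (0, 1, 0), (1, 0, 1), (1, 1, 0)]
lbl = [0, 1, 1, 1]  # last label flipped away from x1 XOR x2
print(best_fit(pts, lbl, n=3, d=1))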