Hardness results for maximum agreement problems have close connections to hardness results for proper learning in computational learning theory. In this paper we prove two hardness results for the problem of finding a low-degree polynomial threshold function (PTF) that has the maximum possible agreement with a given set of labeled examples in R^n × {−1, 1}. We prove that for any constants d ≥ 1 and ε > 0:

• Assuming the Unique Games Conjecture, no polynomial-time algorithm can find a degree-d PTF that is consistent with a (1/2 + ε) fraction of a given set of labeled examples in R^n × {−1, 1}, even if there exists a degree-d PTF that is consistent with a (1 − ε) fraction of the examples.

• It is NP-hard to find a degree-2 PTF that is consistent with a (1/2 + ε) fraction of a given set of labeled examples in R^n × {−1, 1}, even if there exists a halfspace (degree-1 PTF) that is consistent with a (1 − ε) fraction of the examples.

These results immediately imply the following hardness-of-learning consequences: (i) assuming the Unique Games Conjecture, there is no better-than-trivial proper learning algorithm that agnostically learns degree-d PTFs under arbitrary distributions; (ii) there is no better-than-trivial learning algorithm that outputs degree-2 PTFs and agnostically learns halfspaces (i.e., degree-1 PTFs) under arbitrary distributions.
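To make the objective concrete, below is a minimal sketch (not from the paper) of the quantity these hardness results concern: the fraction of a labeled sample in R^n × {−1, 1} on which a candidate degree-d PTF sign(p(x)) agrees with the labels. The function name, the monomial-dictionary representation of p, and the use of numpy are illustrative assumptions.

```python
# Illustrative sketch: the agreement of a degree-d PTF sign(p(x)) with a
# labeled sample in R^n x {-1, 1}. The maximum-agreement problem asks for
# the degree-d polynomial p maximizing this fraction; the paper shows that
# even achieving 1/2 + eps is hard when a (1 - eps)-consistent PTF exists.
import numpy as np

def ptf_agreement(coeffs, examples, labels):
    """Return the fraction of examples (x, y) with sign(p(x)) == y.

    coeffs:   dict mapping a monomial (tuple of variable indices, with
              repetition, length <= d) to its real coefficient; the empty
              tuple () is the constant term.
    examples: array of shape (m, n), points in R^n.
    labels:   array of shape (m,), entries in {-1, +1}.
    """
    agree = 0
    for x, y in zip(examples, labels):
        value = sum(c * np.prod([x[i] for i in mono])
                    for mono, c in coeffs.items())
        # Break ties (p(x) == 0) toward +1; any fixed convention works here.
        prediction = 1 if value >= 0 else -1
        agree += (prediction == y)
    return agree / len(labels)

# Example: the degree-2 PTF sign(x0*x1 - 0.5) on four labeled points in R^2.
coeffs = {(): -0.5, (0, 1): 1.0}
X = np.array([[1.0, 1.0], [1.0, -1.0], [-1.0, 1.0], [-1.0, -1.0]])
y = np.array([1, -1, -1, 1])
print(ptf_agreement(coeffs, X, y))  # 1.0 -- all four examples agree
```

A "better-than-trivial" agnostic learner would have to output a hypothesis whose agreement noticeably exceeds 1/2 (the value achievable by a constant function) whenever some degree-d PTF achieves agreement close to 1; the results above rule this out for proper (and, in the degree-2 case, improper-by-one-degree) learners under the stated assumptions.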