Constant depth circuits, Fourier transform, and learnability
Journal of the ACM (JACM)
An n^{O(log log n)} learning algorithm for DNF under the uniform distribution
Journal of Computer and System Sciences
BPP has subexponential time simulations unless EXPTIME has publishable proofs
Computational Complexity
On the Fourier spectrum of monotone functions
Journal of the ACM (JACM)
P = BPP if E requires exponential circuits: derandomizing the XOR lemma
STOC '97 Proceedings of the twenty-ninth annual ACM symposium on Theory of computing
Uniform-distribution attribute noise learnability
COLT '99 Proceedings of the twelfth annual conference on Computational learning theory
Pseudorandom generators without the XOR lemma
Journal of Computer and System Sciences - Special issue on the fourteenth annual IEEE conference on computational complexity
Hard-core distributions for somewhat hard problems
FOCS '95 Proceedings of the 36th Annual Symposium on Foundations of Computer Science
FOCS '99 Proceedings of the 40th Annual Symposium on Foundations of Computer Science
Towards Proving Strong Direct Product Theorems
CCC '01 Proceedings of the 16th Annual Conference on Computational Complexity
On the noise sensitivity of monotone functions
Random Structures & Algorithms
Using nondeterminism to amplify hardness
STOC '04 Proceedings of the thirty-sixth annual ACM symposium on Theory of computing
Learning intersections and thresholds of halfspaces
Journal of Computer and System Sciences - Special issue on FOCS 2002
On uniform amplification of hardness in NP
Proceedings of the thirty-seventh annual ACM symposium on Theory of computing
Learning DNF from random walks
Journal of Computer and System Sciences - Special issue: Learning theory 2003
On the Fourier tails of bounded functions over the discrete cube
Proceedings of the thirty-eighth annual ACM symposium on Theory of computing
Foundations and Trends® in Theoretical Computer Science
General Pseudo-random Generators from Weaker Models of Computation
ISAAC '09 Proceedings of the 20th International Symposium on Algorithms and Computation
ICALP'11 Proceedings of the 38th international colloquium on Automata, Languages and Programming - Volume Part I
Computational randomness from generalized hardcore sets
FCT'11 Proceedings of the 18th international conference on Fundamentals of computation theory
Impossibility results on weakly black-box hardness amplification
FCT'07 Proceedings of the 16th international conference on Fundamentals of Computation Theory
On the complexity of hard-core set constructions
ICALP'07 Proceedings of the 34th international conference on Automata, Languages and Programming
Sparse extractor families for all the entropy
Proceedings of the 4th conference on Innovations in Theoretical Computer Science
In this paper we investigate the following question: if $\mathsf{NP}$ is slightly hard on average, is it very hard on average? We show the answer is yes: if there is a function in $\mathsf{NP}$ which is $(1 - 1/\mathrm{poly}(n))$-hard for circuits of polynomial size, then there is a function in $\mathsf{NP}$ which is $(1/2 + n^{-1/2 + \epsilon})$-hard for circuits of polynomial size. Our proof technique is to generalize the Yao XOR Lemma, allowing us to characterize nearly tightly the hardness of a composite function $g(f(x_1), \ldots, f(x_n))$ in terms of (i) the original hardness of $f$, and (ii) the \emph{expected bias} of the function $g$ when subjected to random restrictions. The computational result we prove essentially matches an information-theoretic bound.
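To illustrate the information-theoretic side of the XOR Lemma (not the paper's circuit-complexity argument): in $\pm 1$ notation, the XOR of $k$ independent bits, each individually having bias $\epsilon$, has bias exactly $\epsilon^k$, since the expectation of a product of independent bits factors. A minimal Monte Carlo sketch of this multiplicative decay (function names are illustrative, not from the paper):

```python
import random

def biased_bit(eps):
    # Return +1 with probability (1 + eps) / 2, else -1, so E[bit] = eps.
    return 1 if random.random() < (1 + eps) / 2 else -1

def xor_bias_estimate(eps, k, trials=200_000):
    # Empirically estimate E[b_1 * ... * b_k], the bias of the XOR
    # (product in +/-1 notation) of k independent eps-biased bits.
    total = 0
    for _ in range(trials):
        prod = 1
        for _ in range(k):
            prod *= biased_bit(eps)
        total += prod
    return total / trials

if __name__ == "__main__":
    eps, k = 0.5, 3
    print(f"empirical XOR bias: {xor_bias_estimate(eps, k):.3f}")
    print(f"predicted eps**k  : {eps ** k:.3f}")
```

The empirical bias concentrates around $\epsilon^k = 0.125$ here; the content of the XOR Lemma is that an analogous (but harder to prove) decay holds against bounded-size circuits, and the paper's generalization replaces XOR by an arbitrary combiner $g$, with $\epsilon^k$ replaced by the expected bias of $g$ under random restrictions.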