Theoretical Computer Science
The notion of average-case hardness is a fundamental one in complexity theory. In particular, it plays an important role in research on derandomization, as there are general derandomization results based on the assumption that average-case hard functions exist. However, to achieve a complete derandomization one usually needs a function that is extremely hard against a complexity class, in the sense that every algorithm in the class fails to compute the function on a 1/2 - 2^{-Ω(n)} fraction of its n-bit inputs. Unfortunately, lower bound results are rare and are known only for very restricted complexity classes, and achieving such extreme hardness seems even more difficult. Motivated by this, we study in this paper the hardness against linear-size circuits of constant depth. We show that the parity function is extremely hard for them: any such circuit must fail to compute the parity function on at least a 1/2 - 2^{-Ω(n)} fraction of inputs.
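To see why 1/2 is the trivial baseline that the 1/2 - 2^{-Ω(n)} bound approaches, note that any function depending on only some of the input bits agrees with parity on exactly half of all inputs: once a prefix of the bits is fixed, parity of the remaining bits is perfectly balanced. A minimal brute-force sketch (the names `majority3` and `error_fraction` are hypothetical illustrations, not from the paper; `majority3` stands in for a tiny constant-depth circuit that ignores all but three bits):

```python
from itertools import product

def parity(bits):
    # Parity (XOR) of all input bits.
    return sum(bits) % 2

def majority3(bits):
    # Toy approximator: majority of the first three bits only,
    # standing in for a small constant-depth circuit.
    return int(sum(bits[:3]) >= 2)

def error_fraction(f, n):
    # Fraction of n-bit inputs on which f disagrees with parity.
    inputs = list(product([0, 1], repeat=n))
    errors = sum(1 for x in inputs if f(x) != parity(x))
    return errors / len(inputs)

if __name__ == "__main__":
    # For n >= 4 the unused bits keep parity balanced given any
    # fixed prefix, so the error is exactly 1/2 for every n.
    for n in range(4, 9):
        print(n, error_fraction(majority3, n))
```

The point of the experiment: doing no better than a 1/2 error is what any circuit that misses even one bit achieves, so a lower bound of 1/2 - 2^{-Ω(n)} says constant-depth linear-size circuits are essentially no better than trivial on parity.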