Average case complete problems
SIAM Journal on Computing
On the theory of average case complexity
STOC '89 Proceedings of the twenty-first annual ACM symposium on Theory of computing
Random-self-reducibility of complete sets
SIAM Journal on Computing
Journal of Computer and System Sciences
BPP has subexponential time simulations unless EXPTIME has publishable proofs
Computational Complexity
P = BPP if E requires exponential circuits: derandomizing the XOR lemma
STOC '97 Proceedings of the twenty-ninth annual ACM symposium on Theory of computing
A Pseudorandom Generator from any One-way Function
SIAM Journal on Computing
Randomness vs. Time: De-Randomization under a Uniform Assumption
FOCS '98 Proceedings of the 39th Annual Symposium on Foundations of Computer Science
A personal view of average-case complexity
SCT '95 Proceedings of the 10th Annual Structure in Complexity Theory Conference (SCT'95)
Pseudorandomness and Average-Case Complexity via Uniform Reductions
CCC '02 Proceedings of the 17th IEEE Annual Conference on Computational Complexity
How to Go Beyond the Black-Box Simulation Barrier
FOCS '01 Proceedings of the 42nd IEEE symposium on Foundations of Computer Science
On Worst-Case to Average-Case Reductions for NP Problems
FOCS '03 Proceedings of the 44th Annual IEEE Symposium on Foundations of Computer Science
On uniform amplification of hardness in NP
Proceedings of the thirty-seventh annual ACM symposium on Theory of computing
If NP Languages are Hard on the Worst-Case Then It is Easy to Find Their Hard Instances
CCC '05 Proceedings of the 20th Annual IEEE Conference on Computational Complexity
On basing one-way functions on NP-hardness
Proceedings of the thirty-eighth annual ACM symposium on Theory of computing
On the randomness complexity of efficient sampling
Proceedings of the thirty-eighth annual ACM symposium on Theory of computing
Distinguishing SAT from Polynomial-Size Circuits, through Black-Box Queries
CCC '06 Proceedings of the 21st Annual IEEE Conference on Computational Complexity
One-way functions are essential for complexity based cryptography
SFCS '89 Proceedings of the 30th Annual Symposium on Foundations of Computer Science
No better ways to generate hard NP instances than picking uniformly at random
SFCS '90 Proceedings of the 31st Annual Symposium on Foundations of Computer Science
Relativized worlds without worst-case to average-case reductions for NP
APPROX/RANDOM'10 Proceedings of the 13th international conference on Approximation, and the 14th international conference on Randomization, and combinatorial optimization: algorithms and techniques
APPROX'11/RANDOM'11 Proceedings of the 14th international workshop and 15th international conference on Approximation, randomization, and combinatorial optimization: algorithms and techniques
Relativized Worlds without Worst-Case to Average-Case Reductions for NP
ACM Transactions on Computation Theory (TOCT)
A fundamental goal of computational complexity (and the foundations of cryptography) is to find a polynomial-time samplable distribution (e.g., the uniform distribution) and a language in NTIME(f(n)), for some polynomial function f, such that the language is hard on average with respect to this distribution, given that NP is worst-case hard (i.e., ${\rm NP} \neq {\rm P}$, or ${\rm NP} \not\subseteq {\rm BPP}$). Currently, no such result is known even if we relax the language to lie in nondeterministic sub-exponential time. There has been a long line of research trying to explain our failure to prove such worst-case/average-case connections [FF93, Vio03, BT03, AGGM06]. The bottom line of this research is essentially that (under plausible assumptions) non-adaptive Turing reductions cannot prove such results.

In this paper we revisit the problem. Our first observation is that the above-mentioned negative arguments extend to a non-standard notion of average-case complexity, in which the distribution on the inputs, with respect to which we measure the average-case complexity of the language, is only samplable in super-polynomial time. The significance of this result stems from the fact that in this non-standard setting, [GSTS05] did show a worst-case/average-case connection. In other words, their techniques give a way to bypass the impossibility arguments. By taking a closer look at the proof of [GSTS05], we discover that the worst-case/average-case connection is proven by a reduction that "almost" falls under the category ruled out by the negative result. This gives rise to an intriguing new notion of (almost black-box) reductions.

After extending the negative results to the non-standard average-case setting of [GSTS05], we ask whether their positive result can be extended to the standard setting, to prove some new worst-case/average-case connections. While we cannot do that unconditionally, we are able to show that under a mild derandomization assumption, the worst-case hardness of NP implies the average-case hardness of NTIME(f(n)) (under the uniform distribution), where f is computable in quasi-polynomial time.
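To make the final claim concrete, it can be summarized schematically as follows; this is a paraphrase of the abstract, where the distributional-problem notation $(L, \mathcal{U})$ and the class ${\rm Heur\text{-}BPP}$ (problems solvable on most inputs by probabilistic polynomial-time heuristics) are standard average-case conventions and not notation taken from the paper itself:

$${\rm NP} \not\subseteq {\rm BPP} \;\Longrightarrow\; \exists\, L \in {\rm NTIME}(f(n)) \ \text{such that}\ (L, \mathcal{U}) \notin {\rm Heur\text{-}BPP},$$

for some time bound f computable in quasi-polynomial time, and under the mild derandomization assumption mentioned above.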