Information measures for infinite sequences
Theoretical Computer Science
On the optimal compression of sets in PSPACE
FCT'11 Proceedings of the 18th international conference on Fundamentals of computation theory
Under a standard hardness assumption we exactly characterize the worst-case running time of languages that are in average polynomial-time over all polynomial-time samplable distributions. More precisely, we show that if exponential time is not infinitely often in subexponential space, then the following are equivalent for any algorithm $A$: \begin{itemize} \item For all P-samplable distributions $\mu$, $A$ runs in time polynomial on $\mu$-average. \item For every polynomial $p$, the running time of $A$ is bounded by $2^{O(K^p(x)-K(x)+\log(|x|))}$ for \emph{all} inputs $x$, \end{itemize} where $K(x)$ is the Kolmogorov complexity (the size of the smallest program generating $x$) and $K^p(x)$ is the size of the smallest program generating $x$ within time $p(|x|)$. To prove this result we show that, under the hardness assumption, the polynomial-time Kolmogorov distribution, $m^p(x)=2^{-K^p(x)}$, is universal among the P-samplable distributions.
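To get a feel for the worst-case bound $2^{O(K^p(x)-K(x)+\log(|x|))}$, the following toy sketch evaluates it numerically. The function name, the choice of hidden constant $c$, and the sample values of $K(x)$ and $K^p(x)$ are all illustrative assumptions (Kolmogorov complexity itself is uncomputable); the point is only that when the time-bounded and unbounded complexities coincide the bound is polynomial, and each extra bit of gap multiplies the allowed running time by a constant factor.

```python
import math

def time_bound(Kp, K, n, c=1):
    """Toy evaluation of the bound 2^{c * (K^p(x) - K(x) + log|x|)}.

    Kp, K: assumed values of K^p(x) and K(x) in bits (illustrative only);
    n: the input length |x|; c: a stand-in for the O(.) constant.
    """
    return 2 ** (c * (Kp - K + math.log2(n)))

# When K^p(x) = K(x) (no compression is lost by the time bound p),
# the bound collapses to 2^{c log n} = n^c, i.e. polynomial time:
print(time_bound(Kp=300, K=300, n=1024))   # n^1 = 1024.0

# A gap of d = K^p(x) - K(x) bits multiplies the bound by 2^{c d}:
print(time_bound(Kp=310, K=300, n=1024))   # 2^10 * 1024 = 1048576.0
```

So inputs on which $A$ may legitimately run for a long time are exactly those that are compressible in principle but not compressible within time $p$.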