A divergence formula for randomness and dimension

  • Authors: Jack H. Lutz
  • Venue: Theoretical Computer Science
  • Year: 2011

Abstract

If $S$ is an infinite sequence over a finite alphabet $\Sigma$ and $\beta$ is a probability measure on $\Sigma$, then the dimension of $S$ with respect to $\beta$, written $\dim^\beta(S)$, is a constructive version of Billingsley dimension that coincides with the (constructive Hausdorff) dimension $\dim(S)$ when $\beta$ is the uniform probability measure. This paper shows that $\dim^\beta(S)$ and its dual $\mathrm{Dim}^\beta(S)$, the strong dimension of $S$ with respect to $\beta$, can be used in conjunction with randomness to measure the similarity of two probability measures $\alpha$ and $\beta$ on $\Sigma$. Specifically, we prove that the divergence formula

\[ \dim^\beta(R) = \mathrm{Dim}^\beta(R) = \frac{H(\alpha)}{H(\alpha) + D(\alpha \| \beta)} \]

holds whenever $\alpha$ and $\beta$ are computable, positive probability measures on $\Sigma$ and $R \in \Sigma^\infty$ is random with respect to $\alpha$. In this formula, $H(\alpha)$ is the Shannon entropy of $\alpha$, and $D(\alpha \| \beta)$ is the Kullback-Leibler divergence between $\alpha$ and $\beta$. We also show that the above formula holds for all sequences $R$ that are $\alpha$-normal (in the sense of Borel) when $\dim^\beta(R)$ and $\mathrm{Dim}^\beta(R)$ are replaced by the more effective finite-state dimensions $\dim_{\mathrm{FS}}^\beta(R)$ and $\mathrm{Dim}_{\mathrm{FS}}^\beta(R)$. In the course of proving this, we also prove finite-state compression characterizations of $\dim_{\mathrm{FS}}^\beta(S)$ and $\mathrm{Dim}_{\mathrm{FS}}^\beta(S)$.
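To make the right-hand side of the divergence formula concrete, here is a minimal sketch in Python (not from the paper; the function names are assumptions of this sketch) that computes $H(\alpha)$, $D(\alpha \| \beta)$, and the resulting dimension for explicit probability vectors. Because the formula is a ratio, the choice of logarithm base cancels as long as the entropy and the divergence use the same base.

    from math import log2

    def shannon_entropy(p):
        # H(p) in bits; assumes p is a positive probability vector.
        return -sum(pi * log2(pi) for pi in p)

    def kl_divergence(p, q):
        # Kullback-Leibler divergence D(p || q) in bits; assumes p and q
        # are positive probability vectors over the same alphabet.
        return sum(pi * log2(pi / qi) for pi, qi in zip(p, q))

    def divergence_formula(alpha, beta):
        # H(alpha) / (H(alpha) + D(alpha || beta)), the common value of
        # dim^beta(R) and Dim^beta(R) for R random with respect to alpha.
        h = shannon_entropy(alpha)
        return h / (h + kl_divergence(alpha, beta))

    # Binary alphabet: alpha uniform, beta biased toward the first symbol.
    alpha = [0.5, 0.5]
    beta = [0.9, 0.1]
    print(divergence_formula(alpha, beta))  # ~0.576

As a sanity check, taking $\beta$ to be the uniform measure gives $D(\alpha \| \beta) = \log|\Sigma| - H(\alpha)$, so the formula reduces to $H(\alpha)/\log|\Sigma|$, the familiar normalized-entropy dimension of an $\alpha$-random sequence; and taking $\alpha = \beta$ gives dimension 1.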