A Divergence Formula for Randomness and Dimension

  • Authors: Jack H. Lutz
  • Affiliation: Department of Computer Science, Iowa State University, Ames, IA 50011, USA
  • Venue: CiE '09: Proceedings of the 5th Conference on Computability in Europe — Mathematical Theory and Computational Practice
  • Year: 2009

Abstract

If S is an infinite sequence over a finite alphabet Σ and β is a probability measure on Σ, then the dimension of S with respect to β, written $\dim^\beta(S)$, is a constructive version of Billingsley dimension that coincides with the (constructive Hausdorff) dimension dim(S) when β is the uniform probability measure. This paper shows that $\dim^\beta(S)$ and its dual $\mathrm{Dim}^\beta(S)$, the strong dimension of S with respect to β, can be used in conjunction with randomness to measure the similarity of two probability measures α and β on Σ. Specifically, we prove that the divergence formula $$ \dim^\beta(R) = \mathrm{Dim}^\beta(R) = \frac{\mathcal{H}(\alpha)}{\mathcal{H}(\alpha) + \mathcal{D}(\alpha \,||\, \beta)} $$ holds whenever α and β are computable, positive probability measures on Σ and $R \in \Sigma^\infty$ is random with respect to α. In this formula, $\mathcal{H}(\alpha)$ is the Shannon entropy of α, and $\mathcal{D}(\alpha||\beta)$ is the Kullback-Leibler divergence between α and β.
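The right-hand side of the divergence formula is an elementary function of α and β, so it can be evaluated directly for concrete measures. The sketch below (a hedged illustration, not code from the paper; the function names are my own) computes H(α), D(α || β), and the quotient H(α)/(H(α) + D(α || β)) for distributions on a finite alphabet. Two sanity checks follow from the formula itself: when α = β the divergence vanishes and the value is 1, matching the fact that β-random sequences have β-dimension 1; and when β is uniform on a binary alphabet, H(α) + D(α || β) = log |Σ| = 1 bit, so the value reduces to the classical entropy characterization H(α) of constructive Hausdorff dimension.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) in bits over a finite alphabet."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits.

    Requires q to be positive everywhere (as the paper assumes
    positive probability measures).
    """
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

def divergence_formula(alpha, beta):
    """Right-hand side of the divergence formula:
    H(alpha) / (H(alpha) + D(alpha || beta))."""
    h = shannon_entropy(alpha)
    d = kl_divergence(alpha, beta)
    return h / (h + d)

# Example: alpha is a biased coin, beta is the uniform measure on {0, 1}.
alpha = [0.9, 0.1]
beta = [0.5, 0.5]
print(divergence_formula(alpha, beta))  # equals H(alpha), since H + D = 1 bit here
```

With β uniform, the printed value is just H(0.9, 0.1) ≈ 0.469, i.e. the dimension an α-random binary sequence has under the uniform measure.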