If S is an infinite sequence over a finite alphabet Σ and β is a probability measure on Σ, then the dimension of S with respect to β, written $\dim^\beta(S)$, is a constructive version of Billingsley dimension that coincides with the (constructive Hausdorff) dimension dim(S) when β is the uniform probability measure. This paper shows that $\dim^\beta(S)$ and its dual $\mathrm{Dim}^\beta(S)$, the strong dimension of S with respect to β, can be used in conjunction with randomness to measure the similarity of two probability measures α and β on Σ. Specifically, we prove that the divergence formula $$ {\mathrm{dim}}^\beta(R) = {\mathrm{Dim}}^\beta(R) = \frac{{\mathcal{H}}(\alpha)}{{\mathcal{H}}(\alpha) + {\mathcal{D}}(\alpha || \beta)} $$ holds whenever α and β are computable, positive probability measures on Σ and $R \in \Sigma^\infty$ is random with respect to α. In this formula, ${\mathcal{H}}(\alpha)$ is the Shannon entropy of α, and ${\mathcal{D}}(\alpha||\beta)$ is the Kullback-Leibler divergence between α and β.
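As a worked instance of the formula (not taken from the abstract itself), take the binary alphabet Σ = {0,1}, let α = (3/4, 1/4), and let β be the uniform measure, so all logarithms are base 2: $$ {\mathcal{H}}(\alpha) = \tfrac{3}{4}\log\tfrac{4}{3} + \tfrac{1}{4}\log 4 \approx 0.8113, \qquad {\mathcal{D}}(\alpha || \beta) = \tfrac{3}{4}\log\tfrac{3}{2} + \tfrac{1}{4}\log\tfrac{1}{2} = 1 - {\mathcal{H}}(\alpha) \approx 0.1887, $$ so ${\mathcal{H}}(\alpha) + {\mathcal{D}}(\alpha||\beta) = \log|\Sigma| = 1$ and the divergence formula gives $\dim^\beta(R) = {\mathcal{H}}(\alpha) \approx 0.8113$ for every α-random $R$, consistent with the known result that, when β is uniform, the constructive dimension of an α-random sequence equals the Shannon entropy of α.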