Symmetry of information: a closer look

  • Authors: Marius Zimand
  • Affiliations: Department of Computer and Information Sciences, Towson University, Baltimore, MD
  • Venue: WTCS'12 Proceedings of the 2012 International Conference on Theoretical Computer Science: Computation, Physics and Beyond
  • Year: 2012

Abstract

Symmetry of information establishes a relation between the information that x has about y (denoted I(x : y)) and the information that y has about x (denoted I(y : x)). In classical information theory, the two are exactly equal, but in algorithmic information theory there is a small excess quantity of information that separates the two terms, caused by the necessity of packaging information in a way that makes it accessible to algorithms. It was shown in [Zim11] that in the case of strings with simple complexity (that is, strings whose Kolmogorov complexity has small Kolmogorov complexity), the relevant information can be packed in a very economical way, which leads to a tighter relation between I(x : y) and I(y : x) than the one provided by the classical symmetry-of-information theorem of Kolmogorov and Levin. We give here a simpler proof of this result.
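
For reference, a standard formulation of the classical Kolmogorov–Levin symmetry-of-information theorem that the abstract refers to is sketched below. This is textbook background rather than a statement taken from the paper; here C denotes plain Kolmogorov complexity and the mutual information is defined as I(x : y) = C(y) - C(y | x).

\[
  C(x) + C(y \mid x) \;=\; C(y) + C(x \mid y) \;+\; O\!\bigl(\log(C(x) + C(y))\bigr),
\]
which, rewritten in terms of mutual information, gives
\[
  \bigl|\, I(x : y) - I(y : x) \,\bigr| \;=\; O\!\bigl(\log(C(x) + C(y))\bigr).
\]

The logarithmic excess term on the right-hand side is the "packaging" overhead mentioned in the abstract; the result discussed in the paper concerns how much this term can be reduced for strings of simple complexity.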