Inequalities for Shannon entropies and Kolmogorov complexities

  • Authors:
  • D. Hammer; A. E. Romashchenko; A. Shen; N. K. Vereshchagin


  • Venue:
  • CCC '97 Proceedings of the 12th Annual IEEE Conference on Computational Complexity
  • Year:
  • 1997

Abstract

From the very beginning, the notion of complexity of finite objects was considered an algorithmic counterpart to the notion of Shannon entropy. Kolmogorov's 1965 paper was titled "Three approaches to the quantitative definition of information"; Shannon entropy and algorithmic complexity were among these approaches. Kolmogorov later noted (1968) that the properties of algorithmic complexity and Shannon entropy are similar. We investigate one aspect of this similarity: linear inequalities that are valid for Shannon entropies and for Kolmogorov complexities. It turns out that (1) every inequality that is valid for Kolmogorov complexities is also valid for Shannon entropies, and vice versa; (2) every inequality that is valid for Shannon entropies is also valid for ranks of finite subsets of linear spaces; (3) the converse is not true: Ingleton's inequality (1971) is valid for ranks but not for Shannon entropies; (4) in some special cases all three classes of inequalities coincide and have a simple description. We present an inequality for Kolmogorov complexities that implies Ingleton's inequality for ranks; another application of this inequality is a new, simple proof of one of the Gacs-Korner results on common information.
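As an illustrative aside (not part of the paper), linear inequalities of both kinds mentioned in the abstract can be checked numerically on concrete objects. The sketch below verifies a basic Shannon inequality (submodularity) on an arbitrary made-up joint distribution of three binary variables, and verifies Ingleton's inequality for ranks of small, randomly chosen sets of vectors over GF(2); all data are hypothetical examples, not from the paper.

```python
import itertools
import math
import random

def entropy(joint, coords):
    """Shannon entropy of the marginal of `joint` on the given coordinates.
    `joint` maps outcome tuples to probabilities."""
    marginal = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in coords)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marginal.values() if p > 0)

def gf2_rank(vectors):
    """Rank over GF(2) of vectors encoded as integer bitmasks (XOR basis)."""
    pivots = {}  # leading-bit position -> basis vector with that pivot
    for v in vectors:
        while v:
            p = v.bit_length() - 1
            if p in pivots:
                v ^= pivots[p]          # eliminate the leading bit
            else:
                pivots[p] = v           # new independent vector
                break
    return len(pivots)

random.seed(0)

# (1) Submodularity, a basic Shannon inequality:
#     H(X,Z) + H(Y,Z) >= H(X,Y,Z) + H(Z)
weights = [random.random() for _ in range(8)]  # arbitrary joint distribution
total = sum(weights)
joint = {o: w / total
         for o, w in zip(itertools.product((0, 1), repeat=3), weights)}
lhs = entropy(joint, (0, 2)) + entropy(joint, (1, 2))
rhs = entropy(joint, (0, 1, 2)) + entropy(joint, (2,))
assert lhs >= rhs - 1e-12

# (2) Ingleton's inequality for ranks of vector sets A, B, C, D:
#     r(A) + r(B) + r(CD) + r(ABC) + r(ABD)
#       <= r(AB) + r(AC) + r(AD) + r(BC) + r(BD)
def r(*groups):
    return gf2_rank([v for g in groups for v in g])

A, B, C, D = [[random.randrange(1, 32) for _ in range(2)] for _ in range(4)]
left = r(A) + r(B) + r(C, D) + r(A, B, C) + r(A, B, D)
right = r(A, B) + r(A, C) + r(A, D) + r(B, C) + r(B, D)
assert left <= right
```

Both assertions hold for any choice of distribution and vectors, since submodularity is a valid Shannon inequality and Ingleton's inequality is valid for all linear rank functions; by the paper's result (3), however, Ingleton's inequality can fail when ranks are replaced by entropies.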