Kolmogorov Complexity and Information Theory. With an Interpretation in Terms of Questions and Answers

  • Authors:
  • Peter D. Grünwald; Paul M. B. Vitányi

  • Affiliations:
  • CWI, P.O. Box 94079, NL-1090 GB Amsterdam, The Netherlands. E-mail: pdg@cwi.nl; CWI, P.O. Box 94079, NL-1090 GB Amsterdam, The Netherlands. E-mail: paulv@cwi.nl

  • Venue:
  • Journal of Logic, Language and Information
  • Year:
  • 2003

Abstract

We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information and Kolmogorov ("algorithmic") mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate distortion theory, which quantifies useful (in a certain sense) information. We use the communication of information as our guiding motif, and we explain how it relates to sequential question-answer sessions.
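As a small illustration of the first of the basic notions the abstract names, the following sketch computes the Shannon entropy H(X) = -Σ p(x) log₂ p(x) of a finite distribution. The function name and example distributions are illustrative choices, not taken from the paper.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) * log2 p(x), in bits.

    Terms with p(x) == 0 contribute 0 by the usual convention
    0 * log 0 = 0, so they are simply skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per outcome.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so it carries less information.
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.469
```

Entropy is an average over a probability distribution; Kolmogorov complexity, by contrast, assigns an information content to an individual string (the length of its shortest program), which is the contrast the abstract develops.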