Entropic measures, Markov information sources and complexity

  • Authors:
  • Cristian S. Calude; Monica Dumitrescu

  • Affiliations:
  • Department of Computer Science, University of Auckland, Private Bag 92109, Auckland, New Zealand; Faculty of Mathematics, University of Bucharest, Str. Academiei 14, R-70109 Bucharest, Romania

  • Venue:
  • Applied Mathematics and Computation
  • Year:
  • 2002


Abstract

The concept of entropy plays a major role in communication theory. The Shannon entropy is a measure of uncertainty with respect to an a priori probability distribution. In algorithmic information theory, the information content of a message is measured by the size in bits of the smallest program that computes the message. This paper discusses classical entropy and entropy rate for discrete or continuous Markov sources, with finite or continuous alphabets, and their relations to program-size complexity and algorithmic probability. The emphasis is on ideas, constructions, and results; no proofs are given.
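
As a quick reference, the textbook definitions behind these notions are sketched below in commonly used notation (the paper's own notation may differ). For a discrete source emitting symbol i with probability p_i, the Shannon entropy is

  H = -\sum_{i} p_i \log_2 p_i ,

and for a stationary Markov source with transition matrix P = (p_{ij}) and stationary distribution \pi, the entropy rate is

  H = -\sum_{i} \pi_i \sum_{j} p_{ij} \log_2 p_{ij} .

On the algorithmic side, the program-size complexity of a string x relative to a universal machine U is

  H_U(x) = \min\{\, |p| : U(p) = x \,\} ,

and the algorithmic probability of x, for a universal prefix-free machine U, is

  m(x) = \sum_{U(p) = x} 2^{-|p|} .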