Recursively enumerable sets and degrees
Algorithmic information theory
Information randomness & incompleteness: papers on algorithmic information theory (2nd ed.)
Elements of information theory
A proof of the Beyer-Stein-Ulam relation between complexity and entropy
Discrete Mathematics
The unknowable
Exploring randomness
Information and Randomness: An Algorithmic Perspective
On degrees of randomness and genetic randomness
WTCS'12 Proceedings of the 2012 international conference on Theoretical Computer Science: computation, physics and beyond
The concept of entropy plays a major part in communication theory. Shannon entropy is a measure of uncertainty with respect to an a priori probability distribution. In algorithmic information theory, the information content of a message is instead measured by the size in bits of the smallest program that computes that message. This paper discusses classical entropy and entropy rate for discrete and continuous Markov sources, with finite or continuous alphabets, and their relations to program-size complexity and algorithmic probability. The accent is on ideas, constructions and results; no proofs are given.
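As a minimal illustration of the first notion in the abstract, the following sketch computes the Shannon entropy of a finite discrete distribution (the function name and example distribution are this sketch's own, not from the paper):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)), in bits.

    Zero-probability outcomes contribute nothing, following the
    convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 4 symbols has maximal uncertainty: 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
# A certain outcome carries no uncertainty at all.
print(shannon_entropy([1.0]))  # → -0.0
```

The program-size (Kolmogorov) complexity the abstract contrasts this with is uncomputable in general, so no analogous one-liner exists for it.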