We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information, and Kolmogorov ("algorithmic") mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate-distortion theory, which quantifies useful (in a certain sense) information. We use the communication of information as our guiding motif, and we explain how it relates to sequential question-answer sessions.
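As a concrete illustration of the Shannon-side notions named above (entropy and mutual information), the following minimal Python sketch computes both for finite distributions; the function names and the identity I(X;Y) = H(X) + H(Y) - H(X,Y) are standard, but this toy code is our own illustration, not from the paper. Note that the Kolmogorov-complexity counterparts have no such computable analogue, since K is uncomputable.

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
    return -sum(px * math.log2(px) for px in p if px > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint distribution
    given as a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():       # marginalize the joint
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return (entropy(px.values()) + entropy(py.values())
            - entropy(joint.values()))

# A fair coin carries 1 bit of entropy.
print(entropy([0.5, 0.5]))                      # 1.0

# Two perfectly correlated fair bits share 1 bit of information.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))   # 1.0
```

Independent variables give zero mutual information under the same formula, matching the intuition that observing one then tells us nothing about the other.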