The Shannon information measures are well known to be continuous functions of the probability distribution over a given finite alphabet. In this paper, however, we show that these measures are discontinuous with respect to almost all commonly used "distance" measures when the alphabet is countably infinite. Such "distance" measures include the Kullback-Leibler divergence and the variational distance. Specifically, we show that all the Shannon information measures are in fact discontinuous at all probability distributions. The proofs are based on a probability distribution that can be realized by a discrete-time Markov chain with a countably infinite number of states. Our findings reveal that the limiting probability distribution may not fully characterize the asymptotic behavior of a Markov chain. These results explain why certain existing information-theoretic tools are restricted to finite alphabets, and they provide hints on how these tools can be extended to countably infinite alphabets.
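As a concrete illustration of the discontinuity claim, the sketch below uses the standard textbook construction rather than the paper's own Markov-chain argument (an assumption on our part): distributions P_n on a countably infinite alphabet that converge to the point mass delta_0 in both variational distance and Kullback-Leibler divergence, while their entropies H(P_n) diverge. The function names are hypothetical and chosen for readability.

```python
import math

# Assumed construction (not taken from the paper): P_n puts mass
# 1 - 1/n on symbol 0 and spreads the remaining 1/n uniformly over
# 2**(n*n) additional symbols. As n grows, P_n converges to the point
# mass delta_0 in variational distance and in D(delta_0 || P_n),
# yet H(P_n) grows without bound.

def entropy_bits(n: int) -> float:
    """H(P_n) in bits, in closed form: h(1/n) + (1/n) * log2(2**(n*n)),
    so the 2**(n*n) equiprobable tail symbols are never enumerated."""
    eps = 1.0 / n
    h_split = -(1.0 - eps) * math.log2(1.0 - eps) - eps * math.log2(eps)
    return h_split + eps * (n * n)

def variational_distance(n: int) -> float:
    """V(P_n, delta_0) = |P_n(0) - 1| + (total tail mass) = 2/n."""
    return 2.0 / n

def kl_from_delta0(n: int) -> float:
    """D(delta_0 || P_n) = log2(1 / P_n(0)) = -log2(1 - 1/n)."""
    return -math.log2(1.0 - 1.0 / n)

for n in (2, 4, 8, 16, 32):
    print(f"n={n:2d}  V={variational_distance(n):.4f}  "
          f"D(delta0||P_n)={kl_from_delta0(n):.4f}  "
          f"H(P_n)={entropy_bits(n):.2f} bits")
```

Since V(P_n, delta_0) -> 0 and D(delta_0 || P_n) -> 0 while H(P_n) -> infinity, entropy cannot be continuous at delta_0 under either "distance" once the alphabet is countably infinite, which is exactly the phenomenon described above; on a finite alphabet this construction is impossible, since entropy is bounded by the log of the alphabet size.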