The entropy rate is a real-valued functional on the space of discrete random sources for which it exists. However, existence proofs and/or closed formulas are lacking even for classes of random sources that have intuitive parameterizations. A good way to overcome this problem is to examine the analytic properties of the entropy rate relative to a reasonable topology. A canonical choice is the topology induced by the norm of total variation, which arises immediately from viewing a discrete random source as a probability measure on sequence space. It is shown that both the upper and the lower entropy rate, and hence the entropy rate itself whenever it exists, are Lipschitz continuous relative to this topology, a property which, by well-known facts, is close to differentiability. An application of this theorem yields a simple and elementary proof of the existence of the entropy rate for random sources with finite evolution dimension. This class of sources encompasses arbitrary hidden Markov sources and quantum random walks.
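For orientation, the quantities named in the abstract can be written out as follows. This is a sketch of the standard definitions only; the symbols mu and nu for source measures, the finite alphabet A, and the unspecified constant L are notational assumptions, not taken from the abstract.

% A discrete random source over a finite alphabet A is identified with a
% probability measure \mu on the sequence space A^{\mathbb{N}}.
% Upper and lower entropy rate (their common value, when it exists, is the entropy rate):
\[
  \overline{H}(\mu) = \limsup_{n\to\infty} \frac{1}{n}\, H_\mu(X_1,\dots,X_n),
  \qquad
  \underline{H}(\mu) = \liminf_{n\to\infty} \frac{1}{n}\, H_\mu(X_1,\dots,X_n),
\]
% where H_\mu(X_1,\dots,X_n) = -\sum_{w \in A^n} \mu(w)\log\mu(w) is the block
% entropy of the first n symbols under \mu.
% Norm of total variation on the space of source measures:
\[
  \|\mu - \nu\|_{\mathrm{TV}} = \sup_{B} \bigl|\mu(B) - \nu(B)\bigr|,
\]
% the supremum running over measurable subsets B of the sequence space.
% Lipschitz continuity relative to this topology, as claimed in the abstract, means
\[
  \bigl|\overline{H}(\mu) - \overline{H}(\nu)\bigr| \le L\,\|\mu - \nu\|_{\mathrm{TV}},
  \qquad
  \bigl|\underline{H}(\mu) - \underline{H}(\nu)\bigr| \le L\,\|\mu - \nu\|_{\mathrm{TV}},
\]
% for some constant L; the abstract does not specify the constant.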