On analytic properties of entropy rate
IEEE Transactions on Information Theory
We prove that, under mild positivity assumptions, the entropy rate of a hidden Markov chain varies analytically as a function of the parameters of the underlying Markov chain. We state a general principle for determining the domain of analyticity and give an example estimating the radius of convergence of the entropy rate. We then show that the positivity assumptions can be relaxed, and give examples under the relaxed conditions. We study one special class of hidden Markov chains in more detail: binary hidden Markov chains with an unambiguous symbol, for which we give necessary and sufficient conditions for analyticity of the entropy rate. Finally, we show that under the positivity assumptions the hidden Markov chain itself varies analytically, in a strong sense, as a function of the underlying Markov chain parameters.
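As a concrete illustration (not taken from the paper), the sketch below estimates the entropy rate of a simple hidden Markov chain — a symmetric binary Markov chain with transition parameter `p`, observed through a binary symmetric channel with crossover probability `eps` — via the standard conditional-entropy approximation H(Y_n | Y_1, …, Y_{n-1}) = H(Y_1^n) − H(Y_1^{n-1}), computed exactly with the forward algorithm. The smooth dependence of the estimate on `p` and `eps` is what the analyticity result makes rigorous; all function and parameter names here are illustrative.

```python
# Illustrative sketch: entropy-rate estimate for a binary hidden Markov chain.
# The hidden chain is a symmetric two-state Markov chain (flip probability p)
# observed through a binary symmetric channel (crossover probability eps).
import itertools
import math


def block_entropy(p, eps, n):
    """H(Y_1..Y_n) in bits, computed exactly via the forward algorithm."""
    T = [[1 - p, p], [p, 1 - p]]        # Markov transition matrix
    pi = [0.5, 0.5]                     # stationary distribution (symmetric)
    E = [[1 - eps, eps], [eps, 1 - eps]]  # BSC emission probabilities
    H = 0.0
    for y in itertools.product((0, 1), repeat=n):
        # forward pass: alpha[s] = P(y_1..y_t, X_t = s)
        alpha = [pi[s] * E[s][y[0]] for s in (0, 1)]
        for t in range(1, n):
            alpha = [sum(alpha[r] * T[r][s] for r in (0, 1)) * E[s][y[t]]
                     for s in (0, 1)]
        prob = sum(alpha)
        if prob > 0:
            H -= prob * math.log2(prob)
    return H


def entropy_rate_estimate(p, eps, n=8):
    # H(Y_n | Y_1..Y_{n-1}) = H(Y_1^n) - H(Y_1^{n-1}); this conditional
    # entropy decreases to the entropy rate as n grows.
    return block_entropy(p, eps, n) - block_entropy(p, eps, n - 1)
```

As a sanity check, at `eps = 0` the observed process is the Markov chain itself, so for any n ≥ 2 the estimate equals the binary entropy H(p) exactly; for `eps > 0` there is no closed form, which is why questions about the behavior of the entropy rate as a function of the parameters are delicate.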