IEEE Transactions on Information Theory
An introduction to symbolic dynamics and coding
On the Entropy of a Hidden Markov Process
DCC '04 Proceedings of the Conference on Data Compression
Asymptotics of entropy rate in special families of hidden Markov chains
IEEE Transactions on Information Theory
Analyticity of Entropy Rate of Hidden Markov Chains
IEEE Transactions on Information Theory
A Generalization of the Blahut–Arimoto Algorithm to Finite-State Channels
IEEE Transactions on Information Theory
We consider a finite-state memoryless channel with i.i.d. channel state, driven by an input Markov process supported on a mixing finite-type constraint. We analyze the asymptotic behavior of the entropy rate of the output hidden Markov chain and deduce that the mutual information rate of such a channel is concave in the parameters of the input Markov process at high signal-to-noise ratio. In principle, this concavity enables accurate numerical approximation of the maximum mutual information rate and the capacity of such a channel.
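As a hedged illustration (not part of the paper itself): the entropy rate of the output hidden Markov chain has no closed form in general, but it can be estimated numerically via the Shannon–McMillan–Breiman theorem, H(Y) ≈ -(1/n) log p(y_1, ..., y_n), where the likelihood is computed with the normalized forward (sum-product) recursion. The sketch below assumes the simplest instance: a two-state Markov input passed through a binary symmetric channel with crossover probability eps (an i.i.d. channel state); all function and variable names are illustrative.

```python
import numpy as np

def output_entropy_rate(P, eps, n=20000, seed=0):
    """Monte Carlo estimate (in bits/symbol) of the entropy rate of the
    output of a binary symmetric channel with crossover eps, driven by a
    two-state Markov input with transition matrix P.  Illustrative sketch;
    uses the forward recursion H ~ -(1/n) log2 p(y_1..y_n)."""
    rng = np.random.default_rng(seed)

    # Stationary distribution of the input chain (Perron eigenvector of P^T).
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi = np.abs(pi) / np.abs(pi).sum()

    # Simulate the Markov input and the i.i.d. channel noise.
    x = np.empty(n, dtype=int)
    x[0] = int(rng.random() < pi[1])
    for t in range(1, n):
        x[t] = int(rng.random() < P[x[t - 1], 1])
    y = x ^ (rng.random(n) < eps)

    # Emission probabilities: lik[t, s] = p(y_t | x_t = s).
    lik = np.where(np.arange(2)[None, :] == y[:, None], 1.0 - eps, eps)

    # Normalized forward recursion, accumulating log2 p(y_1..y_n).
    alpha = pi * lik[0]
    logp = 0.0
    for t in range(1, n):
        c = alpha.sum()
        logp += np.log2(c)
        alpha = (alpha / c) @ P * lik[t]
    logp += np.log2(alpha.sum())
    return -logp / n
```

As a sanity check on such an estimator: when eps = 0.5 the output is i.i.d. uniform regardless of the input, so the estimate should return exactly 1 bit/symbol; for small eps it approaches the entropy rate of the input Markov chain.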