We study the entropy rate of a binary hidden Markov process (HMP) defined by observing the output of a binary symmetric channel whose input is a first-order binary Markov process. Despite the simplicity of the models involved, the characterization of this entropy is a long-standing open problem. By presenting the probability of a sequence under the model as a product of random matrices, we show that the entropy rate sought is a top Lyapunov exponent of the product, which explains the difficulty in its explicit computation. We apply the same product of random matrices to derive an explicit expression for a first-order Taylor approximation of the entropy rate with respect to the parameter of the binary symmetric channel. The accuracy of the approximation is validated against empirical simulation results. We also extend our results to Rényi's entropy of any order.
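The random-matrix view described above can be sketched numerically. The following is a minimal Monte Carlo illustration (not the paper's own code): it writes P(y_1…y_n) as a product of the per-symbol matrices M(y_t) via a normalized forward recursion, so that −(1/n) log P(y^n) estimates the negative top Lyapunov exponent, i.e. the entropy rate. The function name, parameter choices, and sample sizes are illustrative assumptions.

```python
import numpy as np

def entropy_rate_estimate(p, eps, n=100000, seed=0):
    """Estimate the entropy rate (bits/symbol) of a binary HMP:
    a symmetric first-order Markov chain with flip probability p,
    observed through a BSC with crossover probability eps.
    Uses -log P(y_1..y_n) / n, where P(y^n) is computed as a
    product of random matrices (scaled forward recursion)."""
    rng = np.random.default_rng(seed)
    # Markov transition matrix and BSC emission matrix E[x, y] = P(y | x)
    P = np.array([[1 - p, p], [p, 1 - p]])
    E = np.array([[1 - eps, eps], [eps, 1 - eps]])
    # Sample a state path, then flip each state independently with prob. eps
    x = np.empty(n, dtype=int)
    x[0] = rng.integers(2)
    for t in range(1, n):
        x[t] = x[t - 1] ^ int(rng.random() < p)
    y = x ^ (rng.random(n) < eps)
    # Scaled forward recursion: v_t = v_{t-1} M(y_t), with M(y) = P diag(E[:, y]).
    # Per-step normalization avoids underflow; the log normalizers sum to log P(y^n).
    v = np.array([0.5, 0.5]) * E[:, y[0]]
    c = v.sum()
    logp = np.log(c)
    v /= c
    for t in range(1, n):
        v = (v @ P) * E[:, y[t]]
        c = v.sum()
        logp += np.log(c)
        v /= c
    return -logp / (n * np.log(2))
```

As sanity checks under this sketch: with eps = 0.5 the observations are i.i.d. uniform and the estimate is exactly 1 bit, and with eps = 0 the estimate converges to the binary entropy h(p) of the Markov source.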