On the Entropy of a Hidden Markov Process
DCC '04 Proceedings of the Conference on Data Compression
We study the entropy rate of a hidden Markov process (HMP) obtained by observing the output of a binary symmetric channel whose input is a first-order binary Markov process. Despite the simplicity of the models involved, characterizing this entropy rate is a long-standing open problem. By expressing the probability of a sequence under the model as a product of random matrices, one sees that the entropy rate sought equals the top Lyapunov exponent of the product. This explains the elusiveness of explicit expressions for the HMP entropy rate, since Lyapunov exponents are notoriously difficult to compute. We therefore focus on asymptotic estimates, applying the same product of random matrices to derive an explicit Taylor approximation of the entropy rate with respect to the crossover parameter of the binary symmetric channel. The accuracy of the approximation is validated against empirical simulation results. We also extend our results to higher-order Markov processes and to Rényi entropies of arbitrary order.
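The random-matrix view of the entropy rate also suggests a simple way to estimate it numerically: by the Shannon–McMillan–Breiman theorem, $-\frac{1}{n}\log P(z_1^n)$ converges to the entropy rate, and $P(z_1^n)$ can be accumulated via a normalized forward recursion (a running product of the per-symbol matrices). The following is a minimal sketch of such a Monte Carlo estimate, assuming a symmetric first-order Markov source with flip probability `p` and a BSC with crossover `eps`; the function name and parameter choices are illustrative, not from the paper.

```python
import numpy as np

def hmp_entropy_rate_mc(p=0.1, eps=0.05, n=200_000, seed=0):
    """Monte Carlo estimate of the HMP entropy rate, in bits per symbol.

    p   : flip probability of the (assumed symmetric) binary Markov source
    eps : crossover probability of the binary symmetric channel
    """
    rng = np.random.default_rng(seed)
    # Simulate the hidden binary Markov chain X.
    flips = rng.random(n) < p
    x = np.zeros(n, dtype=int)
    for t in range(1, n):
        x[t] = x[t - 1] ^ flips[t]
    # Observe X through the binary symmetric channel.
    z = x ^ (rng.random(n) < eps).astype(int)
    # Source transition matrix and BSC emission likelihoods.
    P = np.array([[1 - p, p], [p, 1 - p]])
    B = np.array([[1 - eps, eps], [eps, 1 - eps]])  # B[x, z] = P(z | x)
    # Normalized forward recursion: at each step, multiply by the random
    # matrix P * diag(B[:, z_t]) and accumulate the log of the normalizer.
    rho = np.array([0.5, 0.5]) * B[:, z[0]]  # stationary prior is uniform here
    logp = np.log(rho.sum())
    rho /= rho.sum()
    for t in range(1, n):
        rho = (rho @ P) * B[:, z[t]]
        s = rho.sum()
        logp += np.log(s)
        rho /= s
    return -logp / (n * np.log(2))
```

As a sanity check, with `eps = 0` the process is just the Markov source itself, so the estimate should approach the binary entropy $h(p)$; with `eps > 0` the channel noise pushes the entropy rate above that baseline.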