On the capacity of finite state channels and the analysis of convolutional accumulate-m codes
IEEE Transactions on Information Theory
This paper studies the entropy rate of hidden Markov processes (HMPs) generated by observing a discrete-time binary homogeneous Markov chain through an arbitrary memoryless channel. A fixed-point functional equation is derived for the stationary distribution of an input symbol conditioned on all past observations. The existence of a solution to this functional equation is guaranteed by martingale theory, and its uniqueness follows from the fact that the solution is the fixed point of a contraction mapping. The entropy or differential entropy rate of the HMP can then be obtained by computing the average entropy of each input symbol conditioned on past observations. In the absence of an analytical solution to the fixed-point functional equation, a numerical method is proposed in which the functional equation is first converted to a discrete linear system using uniform quantization and then solved efficiently. The error of the computed entropy rate is shown to be proportional to the quantization interval. Unlike many other numerical methods, this approach does not rely on averaging over a sample path of the HMP.
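The quantization-based procedure sketched in the abstract can be illustrated with a minimal example. The code below is not the paper's algorithm; it is an assumed instance for the simplest case: a binary symmetric-style Markov chain observed through a binary symmetric channel (BSC). The predictive belief pi = P(X_t = 1 | past outputs) is quantized into uniform bins, the induced transition on bins is power-iterated to a stationary distribution (rather than simulating a sample path), and the entropy rate is the stationary average of the conditional output entropy. The function name `hmp_entropy_rate` and all parameters are illustrative choices, not from the source.

```python
import numpy as np

def h2(p):
    """Binary entropy in bits; safe at the endpoints."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def hmp_entropy_rate(a, b, eps, n_bins=2000, n_iter=3000):
    """Entropy rate (bits/symbol) of the output of a binary Markov chain
    with P(X'=1|X=0)=a and P(X'=0|X=1)=b, observed through a BSC(eps).

    Quantizes the predictive belief pi = P(X_t=1 | past outputs) into
    n_bins uniform levels and power-iterates the induced chain on bins,
    instead of averaging over a simulated sample path."""
    pi = (np.arange(n_bins) + 0.5) / n_bins          # bin centres in (0, 1)
    q1 = pi * (1 - eps) + (1 - pi) * eps             # P(Y=1 | belief pi)
    # Bayes posterior on X_t after observing Y=1 or Y=0
    post1 = pi * (1 - eps) / q1
    post0 = pi * eps / (1 - q1)
    # propagate the posterior one step through the Markov chain
    nxt1 = post1 * (1 - b) + (1 - post1) * a
    nxt0 = post0 * (1 - b) + (1 - post0) * a
    # nearest-bin quantization of the updated beliefs
    idx1 = np.clip((nxt1 * n_bins).astype(int), 0, n_bins - 1)
    idx0 = np.clip((nxt0 * n_bins).astype(int), 0, n_bins - 1)
    mu = np.full(n_bins, 1.0 / n_bins)               # initial belief law
    for _ in range(n_iter):                          # power iteration
        new = np.zeros(n_bins)
        np.add.at(new, idx1, mu * q1)
        np.add.at(new, idx0, mu * (1 - q1))
        mu = new
    # entropy rate = stationary average of H(Y_t | past outputs)
    return float(np.sum(mu * h2(q1)))
```

As sanity checks under this toy model: with a noiseless channel (eps = 0) the output entropy rate reduces to the entropy rate of the symmetric chain itself, h2(p), and with eps = 0.5 the output is i.i.d. uniform, giving 1 bit per symbol. The error of such a scheme shrinks with the bin width, consistent with the quantization-interval bound stated in the abstract.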