IEEE Transactions on Information Theory
We study the classical problem of noisy constrained capacity in the case of the binary symmetric channel (BSC), namely, the capacity of a BSC whose input is a sequence from a constrained set. As stated by Fan et al., "... while calculation of the noise-free capacity of constrained sequences is well known, the computation of the capacity of a constraint in the presence of noise... has been an unsolved problem in the half-century since Shannon's landmark paper." We first express the constrained capacity of a binary symmetric channel with (d, k)-constrained input as a limit of the top Lyapunov exponents of certain random matrix processes. Then, we compute asymptotic approximations of the noisy constrained capacity for the case where the noise parameter ε is small. In particular, we show that when k ≤ 2d, the error term (the difference between the noisy and the noise-free capacity) is O(ε), whereas it is O(ε log ε) when k > 2d. In both cases, we compute the coefficient of the error term. In the course of establishing these findings, we also extend our previous results on the entropy of a hidden Markov process to higher-order finite-memory processes. These results are proved by a combination of analytic and combinatorial methods.
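As background to the quoted remark, the noise-free capacity of a (d, k) constraint is indeed routine to compute: it is the base-2 logarithm of the Perron (largest) eigenvalue of the adjacency matrix of the constraint graph. The sketch below is not taken from the paper; it is a standard illustration, with the state encoding (current run length of 0s since the last 1) chosen for concreteness.

```python
import numpy as np

def dk_capacity(d, k):
    """Noise-free Shannon capacity of the (d, k)-RLL constraint,
    computed as log2 of the Perron eigenvalue of the constraint graph.
    State i in {0, ..., k} = current run of 0s since the last 1."""
    n = k + 1
    A = np.zeros((n, n))
    for i in range(n):
        if i < k:
            A[i, i + 1] = 1.0  # emit a 0: run grows by one
        if i >= d:
            A[i, 0] = 1.0      # emit a 1: run resets (allowed once >= d)
    lam = max(abs(np.linalg.eigvals(A)))
    return float(np.log2(lam))

print(dk_capacity(1, 3))  # ≈ 0.5515, the well-known (1,3)-RLL capacity
```

For (d, k) = (1, 3), the Perron root satisfies z^4 = z^2 + z + 1, giving capacity ≈ 0.5515; the noisy constrained capacity studied in the paper reduces to this value as ε → 0.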
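The characterization via top Lyapunov exponents refers to the growth rate of a random product of nonnegative matrices (Furstenberg–Kesten). While the paper's specific matrix processes are not reproduced here, a generic Monte Carlo estimator of a top Lyapunov exponent can be sketched as follows; the i.i.d. sampling of matrices is a simplifying assumption for illustration.

```python
import numpy as np

def top_lyapunov(matrices, probs, n_steps=100_000, seed=0):
    """Estimate the top Lyapunov exponent of a random matrix product:
    apply a randomly drawn matrix to a vector, accumulate log of the
    norm growth, and renormalize to avoid overflow/underflow."""
    rng = np.random.default_rng(seed)
    v = np.ones(matrices[0].shape[0])
    v /= np.linalg.norm(v)
    log_sum = 0.0
    for _ in range(n_steps):
        M = matrices[rng.choice(len(matrices), p=probs)]
        v = M @ v
        norm = np.linalg.norm(v)
        log_sum += np.log(norm)
        v /= norm
    return log_sum / n_steps
```

Sanity check: for the degenerate "random" product that always picks 2I, every step scales the vector by 2, so the estimator returns log 2 exactly.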