The capacity of a 1-D constraint is given by the entropy of a corresponding stationary maxentropic Markov chain; that is, the entropy is maximized over a set of probability distributions defined by linear requirements. In this paper, certain aspects of this characterization are extended to 2-D constraints. The result is a method for calculating an upper bound on the capacity of 2-D constraints. The key steps are: the maxentropic stationary probability distribution on square configurations is considered; a set of linear equalities and inequalities is derived from this stationarity; the result is a concave program, which can be solved numerically with ease. Our method improves upon previous upper bounds for the capacity of the 2-D "no isolated bits" constraint, as well as certain 2-D RLL constraints.
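To illustrate the 1-D characterization that the paper generalizes, the sketch below (our own toy example, not the paper's actual program) computes the capacity of the 1-D (1,&infin;)-RLL "golden mean" constraint by maximizing the entropy rate over edge probabilities of its presentation graph, subject to the linear stationarity and normalization constraints. The edge list, variable names, and use of `scipy.optimize.minimize` are assumptions for illustration; the optimum should approach log2 of the golden ratio, about 0.6942.

```python
import numpy as np
from scipy.optimize import minimize

# Presentation graph of the (1,inf)-RLL constraint (no two adjacent 1s):
# states {0, 1}; edges 0->0, 0->1, 1->0. Variables q are edge probabilities.
edges = [(0, 0), (0, 1), (1, 0)]

def neg_entropy_rate(q):
    # Stationary state probabilities: pi[u] = total probability of edges leaving u.
    pi = np.zeros(2)
    for (u, _), qe in zip(edges, q):
        pi[u] += qe
    # Entropy rate of the Markov chain: H = -sum_e q_e * log2(q_e / pi_src(e)).
    h = 0.0
    for (u, _), qe in zip(edges, q):
        if qe > 1e-12:
            h -= qe * np.log2(qe / pi[u])
    return -h  # minimize the negative -> maximize entropy

constraints = [
    {"type": "eq", "fun": lambda q: q.sum() - 1.0},  # probabilities sum to 1
    {"type": "eq", "fun": lambda q: q[1] - q[2]},    # stationarity: flow 0->1 = flow 1->0
]
res = minimize(neg_entropy_rate, x0=np.array([0.4, 0.3, 0.3]),
               bounds=[(0.0, 1.0)] * 3, constraints=constraints)
capacity = -res.fun
print(round(capacity, 4))
```

Because the entropy rate is concave in the edge probabilities and all constraints are linear, this is exactly the kind of concave program the paper derives for 2-D square configurations, where the linear equalities and inequalities come from stationarity of the maxentropic distribution.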