Principles of Digital Communication and Coding
Information Theory: Coding Theorems for Discrete Memoryless Systems
Universal decoding for channels with memory
IEEE Transactions on Information Theory
On the universality of the LZ-based decoding algorithm
IEEE Transactions on Information Theory
Universal composite hypothesis testing: a competitive minimax approach
IEEE Transactions on Information Theory
Minimax Universal Decoding With an Erasure Option
IEEE Transactions on Information Theory
Achievable Error Exponents for the Private Fingerprinting Game
IEEE Transactions on Information Theory
Universally achievable error exponents pertaining to certain families of channels (most notably, discrete memoryless channels (DMCs)) and various ensembles of random codes are studied by combining the competitive minimax approach, proposed by Feder and Merhav, with the Chernoff bound and Gallager's techniques for the analysis of error exponents. In particular, we derive a single-letter expression for the largest universally achievable fraction ξ of the optimum error exponent pertaining to optimum maximum-likelihood (ML) decoding. Moreover, a simpler single-letter expression for a lower bound on ξ is presented. To demonstrate the tightness of this lower bound, we use it to show that ξ = 1 for the binary symmetric channel (BSC) when the random coding distribution is uniform over: i) all codes (of a given rate), and ii) all linear codes, in agreement with well-known results. We also show that ξ = 1 for the uniform ensemble of systematic linear codes, and for that of time-varying convolutional codes in the bit-error-rate sense. For the latter case, we also derive the corresponding universal decoder explicitly and show how it can be efficiently implemented using a slightly modified version of the Viterbi algorithm which employs two trellises.
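As a point of reference for the implementation remark above, the following is a minimal sketch of a *standard* Viterbi decoder, i.e., ML decoding of a rate-1/2 convolutional code over a BSC via a minimum-Hamming-distance trellis search. It does not reproduce the paper's two-trellis universal variant; the code, generator polynomials (7, 5 in octal), and constraint length are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch only: plain single-trellis Viterbi decoding
# (minimum Hamming distance = ML over a BSC with crossover < 1/2).
# Not the paper's modified two-trellis universal decoder.

def conv_encode(bits, g=(0b111, 0b101), k=3):
    """Rate-1/2 feedforward convolutional encoder, zero initial state."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & ((1 << k) - 1)  # shift in new bit
        for gen in g:                                # one output per generator
            out.append(bin(state & gen).count("1") % 2)
    return out

def viterbi_decode(received, g=(0b111, 0b101), k=3):
    """Trellis search minimizing Hamming distance to the received word."""
    n_states = 1 << (k - 1)
    INF = float("inf")
    metric = [0.0] + [INF] * (n_states - 1)  # encoder starts in state 0
    paths = [[] for _ in range(n_states)]    # surviving input sequences
    for t in range(len(received) // 2):
        r = received[2 * t: 2 * t + 2]
        new_metric = [INF] * n_states
        new_paths = [None] * n_states
        for s in range(n_states):
            if metric[s] == INF:
                continue
            for b in (0, 1):                 # hypothesized input bit
                full = ((s << 1) | b) & ((1 << k) - 1)
                ns = full & (n_states - 1)   # next state: last k-1 input bits
                branch = [bin(full & gen).count("1") % 2 for gen in g]
                m = metric[s] + sum(x != y for x, y in zip(branch, r))
                if m < new_metric[ns]:       # keep the survivor
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(n_states), key=lambda s: metric[s])
    return paths[best]
```

The universal decoder described in the abstract replaces the channel-dependent (Hamming) branch metric with a universal one, which is what necessitates the second trellis; the dynamic-programming skeleton above is otherwise the common starting point.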