Consider two random variables X and Y, where X is finitely (or countably-infinitely) valued and Y is arbitrary. Let ε denote the minimum probability of error incurred in estimating X from Y. It is shown that

ε ≥ sup_{0≤α≤1} (1−α) P(π(X|Y) ≤ α),

where π(X|Y) denotes the posterior probability of X given Y. This bound finds information-theoretic applications in the proof of converse channel coding theorems. It generalizes and strengthens previous lower bounds due to Shannon, and to Verdú and Han (1994).
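The bound can be checked numerically on a small discrete example: compute the MAP error ε exactly from a joint distribution, then maximize (1−α)·P(π(X|Y) ≤ α) over a grid of α. A minimal sketch, assuming an illustrative joint distribution P(X, Y) not taken from the paper:

```python
import numpy as np

# Hypothetical joint distribution P(X=x, Y=y); rows index x, columns index y.
# This matrix is purely illustrative, not from the paper.
P = np.array([[0.30, 0.10, 0.10],
              [0.05, 0.25, 0.20]])

P_y = P.sum(axis=0)   # marginal of Y
post = P / P_y        # posterior pi(x|y); each column sums to 1

# Minimum error probability: achieved by MAP estimation of X from Y.
eps = 1.0 - (P_y * post.max(axis=0)).sum()

# Lower bound: sup over alpha of (1 - alpha) * P(pi(X|Y) <= alpha),
# where pi(X|Y) is the posterior evaluated at the true pair (X, Y).
alphas = np.linspace(0.0, 1.0, 10001)
tail = np.array([(P * (post <= a)).sum() for a in alphas])
bound = ((1.0 - alphas) * tail).max()

print(f"eps = {eps:.4f}, lower bound = {bound:.4f}")
assert bound <= eps + 1e-12  # the bound never exceeds the true error
```

For this joint distribution the MAP error is 0.25 while the bound evaluates to about 0.20, illustrating that the supremum can sit strictly below ε while still being nontrivial.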