A lower bound on the probability of error in multihypothesis testing

  • Authors:
  • H. V. Poor; S. Verdu

  • Affiliations:
  • Dept. of Electr. Eng., Princeton Univ., NJ; -

  • Venue:
  • IEEE Transactions on Information Theory - Part 2
  • Year:
  • 1995

Abstract

Consider two random variables X and Y, where X is finitely (or countably infinitely) valued and Y is arbitrary. Let ε denote the minimum probability of error incurred in estimating X from Y. It is shown that ε ⩾ sup_{0⩽α⩽1} (1-α)P(π(X|Y) ⩽ α), where π(X|Y) denotes the posterior probability of X given Y. This bound finds information-theoretic applications in the proof of converse channel coding theorems. It generalizes and strengthens previous lower bounds due to Shannon and to Verdu and Han (1994).
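
The bound stated in the abstract can be checked numerically on a small finite example. The following is a minimal sketch, assuming NumPy and a made-up joint distribution over three hypotheses and four observations (the matrix `joint` and all numbers are illustrative, not from the paper); it compares the exact minimum (MAP) error probability with the lower bound sup_{0⩽α⩽1} (1-α)P(π(X|Y) ⩽ α). Since P(π(X|Y) ⩽ α) is a step function that only jumps at realized posterior values, the supremum over α can be evaluated on that finite set.

```python
import numpy as np

# Hypothetical joint distribution P(X=x, Y=y): rows index X, columns index Y.
joint = np.array([
    [0.10, 0.05, 0.05, 0.10],
    [0.05, 0.15, 0.05, 0.05],
    [0.05, 0.05, 0.20, 0.10],
])
assert np.isclose(joint.sum(), 1.0)

# Exact minimum error probability of the MAP estimator of X from Y:
# eps = 1 - sum_y max_x P(x, y)
eps = 1.0 - joint.max(axis=0).sum()

# Posterior pi(X|Y) evaluated at each pair (x, y): P(x | y) = P(x, y) / P(y)
p_y = joint.sum(axis=0)
posterior = joint / p_y  # broadcasts the division over the rows

# Lower bound: sup over alpha in [0, 1] of (1 - alpha) * P(pi(X|Y) <= alpha).
# The CDF of pi(X|Y) only jumps at realized posterior values, so evaluating
# the objective at those values (plus alpha = 0) attains the supremum.
alphas = np.unique(np.concatenate(([0.0], posterior.ravel())))
bound = max((1.0 - a) * joint[posterior <= a].sum() for a in alphas)

print(f"minimum error probability eps = {eps:.4f}")
print(f"lower bound on eps            = {bound:.4f}")
assert bound <= eps + 1e-12  # the bound never exceeds the true error probability
```

For this toy distribution the sketch reports eps = 0.45 and a lower bound of about 0.33, consistent with the inequality in the abstract.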