Generalizing the Fano inequality

  • Authors:
Te Sun Han; S. Verdú

  • Affiliations:
Program in Inf. Sci., Univ. of Electro-Commun., Tokyo

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2006

Abstract

The Fano inequality gives a lower bound on the mutual information between two random variables that take values on an M-element set, provided at least one of the random variables is equiprobable. The authors show several simple lower bounds on mutual information which do not assume such a restriction. In particular, this can be accomplished by replacing log M with the infinite-order Rényi entropy in the Fano inequality. Applications to hypothesis testing are exhibited, along with bounds on mutual information in terms of the a priori and a posteriori error probabilities.
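
For reference, a sketch of the classical Fano inequality and the infinite-order Rényi entropy mentioned in the abstract (standard textbook statements, not the paper's exact generalized bounds):

\[
  H(X \mid Y) \le h(P_e) + P_e \log(M-1),
  \qquad
  I(X;Y) \ge H(X) - h(P_e) - P_e \log(M-1),
\]
where $P_e = \Pr[X \neq Y]$ and $h(\cdot)$ is the binary entropy function; when $X$ is equiprobable, $H(X) = \log M$. The generalization described in the abstract replaces $\log M$ with the infinite-order Rényi entropy
\[
  H_\infty(X) = -\log \max_{x} P_X(x),
\]
which requires no equiprobability assumption.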