Information bounds of the Fano-Kullback type

  • Authors:
  • R. Blahut

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 1976

Abstract

A large class of lower bounds on the performance of hypothesis testers, channel codes, and source compression codes is developed. These are extensions of Fano's inequality on the one hand, and of the discrimination inequality of Kullback on the other. The hypothesis testing and channel coding bounds are interesting primarily for small blocklengths and, in general, are asymptotically inferior to the well-known exponentially decreasing bounds. The source compression results include new proofs of converse coding theorems. A lower bound is given on the probability that a source produces an output block which cannot be encoded within a desired maximum distortion.
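For context, the two classical inequalities named in the abstract can be stated in their standard forms (these are the familiar starting points, not the paper's generalized bounds). Fano's inequality bounds the conditional entropy of a random variable $X$, taking values in an alphabet of size $M$, in terms of the error probability $P_e = \Pr[\hat{X} \neq X]$ of any estimator $\hat{X}$ formed from the observation $Y$; Kullback's discrimination inequality asserts the nonnegativity of the discrimination (relative entropy, or divergence) between two distributions:

```latex
% Fano's inequality, with h(.) the binary entropy function:
\[
  H(X \mid Y) \;\le\; h(P_e) + P_e \log (M - 1),
  \qquad h(p) = -p \log p - (1-p)\log(1-p).
\]

% Kullback's discrimination inequality, with equality iff P = Q:
\[
  D(P \,\|\, Q) \;=\; \sum_x P(x) \log \frac{P(x)}{Q(x)} \;\ge\; 0.
\]
```

The paper's Fano-Kullback bounds interpolate between and extend these two statements, yielding converse (lower) bounds on attainable error probability and distortion.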