Hypothesis testing and information theory

  • Authors:
  • R. Blahut

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 1974


Abstract

The testing of binary hypotheses is developed from an information-theoretic point of view, and the asymptotic performance of optimum hypothesis testers is developed in exact analogy to the asymptotic performance of optimum channel codes. The discrimination, introduced by Kullback, is developed in a role analogous to that of mutual information in channel coding theory. Based on the discrimination, an error-exponent function e(r) is defined. This function is found to describe the behavior of optimum hypothesis testers asymptotically with block length. Next, mutual information is introduced as a minimum of a set of discriminations. This approach has later coding significance. The channel reliability-rate function E(R) is defined in terms of discrimination, and a number of its mathematical properties are developed. Sphere-packing-like bounds are developed in a relatively straightforward and intuitive manner by relating e(r) and E(R). This ties together the aforementioned developments and gives a lower bound in terms of a hypothesis testing model. The result is valid for discrete or continuous probability distributions. The discrimination function is also used to define a source code reliability-rate function. This function allows a simpler proof of the source coding theorem and also bounds the code performance as a function of block length, thereby providing the source coding analog of E(R).
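
For reference, the central quantities in the abstract can be written out explicitly. The block below gives standard definitions consistent with the finite-alphabet setting; the notation (P, Q, W, p) is ours, and the stated form of e(r) is one common convention rather than a quote of the paper's exact definition.

```latex
% Discrimination (Kullback) between distributions P and Q:
D(P \| Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}

% One common form of the error-exponent function for testing
% H_0: P_0 against H_1: P_1 (conventions for which hypothesis
% carries the constraint vary across treatments):
e(r) = \min_{Q \,:\, D(Q \| P_0) \le r} D(Q \| P_1)

% Mutual information as a minimum of discriminations over output
% distributions Q, for input distribution p and channel W(y|x);
% the minimum is achieved at the output marginal Q^*(y) = \sum_x p(x) W(y|x):
I(X;Y) = \min_{Q} \sum_{x} p(x) \, D\big( W(\cdot \mid x) \,\big\|\, Q \big)
```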
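
Along the exponential family of distributions tilted between the two hypotheses, the (r, e(r)) tradeoff can be traced parametrically, which makes the asymptotic behavior easy to compute numerically. The Python sketch below assumes finite alphabets and strictly positive distributions; the function names and the example distributions are illustrative, not taken from the paper.

```python
import numpy as np

def tilted(p0, p1, s):
    """Tilted (exponential-family) distribution between p0 and p1."""
    q = p0 ** (1.0 - s) * p1 ** s
    return q / q.sum()

def discrimination(p, q):
    """Kullback's discrimination D(p||q) = sum p log(p/q), in nats."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def error_exponent_curve(p0, p1, num=50):
    """Trace (r, e(r)) parametrically along the tilted family:
    r = D(q_s||p0), e(r) = D(q_s||p1) for s in [0, 1]."""
    pts = []
    for s in np.linspace(0.0, 1.0, num):
        q = tilted(p0, p1, s)
        pts.append((discrimination(q, p0), discrimination(q, p1)))
    return pts

# Example: two distributions on a 3-letter alphabet (hypothetical numbers).
p0 = np.array([0.5, 0.3, 0.2])
p1 = np.array([0.2, 0.3, 0.5])
for r, e in error_exponent_curve(p0, p1, num=6):
    print(f"r = {r:.4f}   e(r) = {e:.4f}")
```

At s = 0 the tilted distribution equals P0, giving the endpoint r = 0 with e(0) = D(P0||P1); at s = 1 it equals P1, giving e = 0 at r = D(P1||P0). Intermediate values of s sweep out the tradeoff curve between the two error exponents.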