The amount of information that y gives about X

  • Authors: N. Blachman
  • Venue: IEEE Transactions on Information Theory
  • Year: 1968


Abstract

No single measure M(X;y) of the amount of information that a specific value y of a random variable Y gives about another random variable X has all of the desirable properties possessed by Shannon's measure I(X;Y) = E{M(X;y)} of the average mutual information of X and Y. It is shown that one of these properties (additivity) determines one particular form for M(X;y), while others (non-negativity or coordinate independence) determine a different form. The latter, which is the more useful and accepted information measure, is thus seen to be unique.
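As a concrete illustration (a sketch, not taken from the paper itself): the non-negative candidate for M(X;y) can be written as the Kullback–Leibler divergence D(p(x|y) ‖ p(x)), and averaging it over p(y) recovers Shannon's I(X;Y). The joint distribution below is an assumed toy example.

```python
import math

# Assumed toy joint distribution p(x, y) over {0, 1} x {0, 1} (illustration only).
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

xs, ys = (0, 1), (0, 1)
p_x = {x: sum(p_xy[(x, y)] for y in ys) for x in xs}  # marginal of X
p_y = {y: sum(p_xy[(x, y)] for x in xs) for y in ys}  # marginal of Y

def m_kl(y):
    """Non-negative form: M(X;y) = D( p(x|y) || p(x) ), in bits."""
    return sum((p_xy[(x, y)] / p_y[y])
               * math.log2(p_xy[(x, y)] / (p_y[y] * p_x[x]))
               for x in xs)

# Averaging the per-y measure over p(y) gives the mutual information.
I_avg = sum(p_y[y] * m_kl(y) for y in ys)

# Direct computation of I(X;Y) from the joint, for comparison.
I_direct = sum(p_xy[(x, y)] * math.log2(p_xy[(x, y)] / (p_x[x] * p_y[y]))
               for x in xs for y in ys)
```

Each m_kl(y) is non-negative by construction (it is a divergence), which is exactly the property that, per the abstract, singles out this form; its expectation equals I(X;Y).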