Cumulative residual entropy: a new measure of information

  • Authors:
  • Murali Rao; Y. Chen; B. C. Vemuri; Fei Wang

  • Affiliations:
  • Dept. of Math., Univ. of Florida, Gainesville, FL, USA

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2004

Abstract

In this paper, we use the cumulative distribution of a random variable to define its information content and thereby develop an alternative measure of uncertainty that extends Shannon entropy to random variables with continuous distributions. We call this measure cumulative residual entropy (CRE). The salient features of CRE are as follows: 1) it is more general than the Shannon entropy, in that its definition is valid in both the continuous and discrete domains; 2) it possesses more general mathematical properties than the Shannon entropy; and 3) it can be easily computed from sample data, and these estimates converge asymptotically to the true values. The properties of CRE and a precise formula relating CRE to Shannon entropy are given in the paper. Finally, we present some applications of CRE to reliability engineering and computer vision.
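
The abstract states the definition only in words. As a rough guide (not quoted from the paper), for a non-negative scalar random variable X with survival function F̄(x) = P(X > x), the cumulative residual entropy is commonly written as

$$\mathcal{E}(X) \;=\; -\int_0^\infty P(X > x)\,\log P(X > x)\,dx,$$

with the paper's more general definition stated for vector-valued X in terms of |X|. The sketch below illustrates claim 3) of the abstract: CRE can be estimated from sample data by plugging the empirical survival function into the integral. This is a minimal illustration, not the authors' implementation; the helper name empirical_cre and the use of NumPy are assumptions for the example.

```python
import numpy as np

def empirical_cre(samples):
    """Estimate the cumulative residual entropy (CRE) of a non-negative
    random variable from i.i.d. samples via the empirical survival function."""
    x = np.sort(np.asarray(samples, dtype=float))
    n = x.size
    # On the interval [x[i], x[i+1]) the empirical survival probability is
    # (n - i - 1) / n; below x[0] it equals 1, where -1*log(1) contributes 0.
    surv = (n - np.arange(1, n)) / n      # (n-1)/n, (n-2)/n, ..., 1/n
    gaps = np.diff(x)                     # widths of the intervals between order statistics
    return float(np.sum(gaps * (-surv * np.log(surv))))

# For X ~ Exponential(1) the CRE is exactly 1, so the estimate should be
# close to 1 for a large sample, consistent with the convergence claim 3).
rng = np.random.default_rng(0)
print(empirical_cre(rng.exponential(scale=1.0, size=100_000)))
```

Because the empirical survival function is a step function, the integral reduces to a finite sum over the gaps between sorted samples, which is why no numerical quadrature is needed.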