Mean Entropies

  • Authors:
  • B. H. Lavenda

  • Affiliations:
  • Università degli Studi, Camerino, Italy 62032 (MC)

  • Venue:
  • Open Systems & Information Dynamics
  • Year:
  • 2005


Abstract

Entropies are expressed in terms of mean values, rather than as weighted arithmetic means of their generating functions, which result in pseudo-additive entropies. The Shannon entropy corresponds to the logarithm of the inverse of the geometric mean, while the Rényi entropy corresponds, more generally, to the logarithm of the inverse of power means of order α − 1. Translation invariance of the means relates them to mean code lengths, while their homogeneity translates them into entropies: under the Kraft equality, the arithmetic and exponential means correspond to the Shannon and Rényi entropies, respectively; under the Kraft inequality, the entropies are lower bounds on the mean code lengths. Means of any order cannot be expressed as escort averages, because such averages contradict the fact that the means are monotonically increasing functions of their order. Exponential entropies are shown to be measures of the extent of a distribution. The probability measure and the incomplete probability distribution are shown to be the ranges of continuous and discrete sample spaces, respectively. A comparison is made with Boltzmann's principle.
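The mean-value identities stated in the abstract can be checked numerically. The sketch below (an illustration, not taken from the paper; the example distribution and code lengths are assumptions) verifies that the Shannon entropy equals the log of the inverse of the probability-weighted geometric mean, that the Rényi entropy of order α equals the log of the inverse of the weighted power mean of order α − 1, and that under the Kraft equality the mean code length coincides with the Shannon entropy in bits.

```python
import math

# Example probability distribution (assumption, dyadic so exact code lengths exist).
p = [0.5, 0.25, 0.125, 0.125]

# Shannon entropy (nats): H = -sum p_i log p_i.
shannon = -sum(pi * math.log(pi) for pi in p)

# Weighted geometric mean of the p_i with weights p_i: prod p_i^{p_i}.
geo_mean = math.exp(sum(pi * math.log(pi) for pi in p))
# Shannon entropy = log of the inverse of the geometric mean.
assert abs(shannon - math.log(1.0 / geo_mean)) < 1e-12

# Rényi entropy of order a: H_a = log(sum p_i^a) / (1 - a).
a = 2.0
renyi = math.log(sum(pi ** a for pi in p)) / (1.0 - a)

# Weighted power mean of order r = a - 1: M_r = (sum p_i * p_i^r)^(1/r).
r = a - 1.0
power_mean = sum(pi * pi ** r for pi in p) ** (1.0 / r)
# Rényi entropy = log of the inverse of the power mean of order a - 1.
assert abs(renyi - math.log(1.0 / power_mean)) < 1e-12

# Kraft equality: binary code lengths l_i = -log2 p_i satisfy sum 2^{-l_i} = 1,
# and the mean code length equals the Shannon entropy in bits.
lengths = [1, 2, 3, 3]
assert abs(sum(2.0 ** -l for l in lengths) - 1.0) < 1e-12
mean_len = sum(pi * li for pi, li in zip(p, lengths))
shannon_bits = -sum(pi * math.log2(pi) for pi in p)
assert abs(mean_len - shannon_bits) < 1e-12
```

With a non-dyadic distribution the Kraft relation becomes an inequality and the mean code length strictly exceeds the entropy, matching the lower-bound statement in the abstract.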