Information Theory, Relative Entropy and Statistics

  • Author: François Bavaud
  • Affiliation: University of Lausanne, Switzerland
  • Venue: Formal Theories of Information
  • Year: 2009

Abstract

Shannon's Information Theory (IT) (1948) definitively established the purely mathematical nature of entropy and relative entropy, in contrast to the earlier identification by Boltzmann (1872) of his "H-functional" with the physical entropy of the earlier thermodynamicians (Carnot, Clausius, Kelvin). The following recollection is attributed to Shannon (Tribus and McIrvine 1971): My greatest concern was what to call it. I thought of calling it "information", but the word was overly used, so I decided to call it "uncertainty". When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, "You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage."
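
For reference, a minimal LaTeX sketch of the two quantities named in the abstract, using the standard textbook definitions; the symbols H and K and the discrete-distribution setting are assumptions for illustration, not notation taken from the paper:

```latex
% Shannon entropy of a discrete distribution p = (p_1, ..., p_n)
% (standard definition; notation assumed, not from the paper)
H(p) = -\sum_{i=1}^{n} p_i \log p_i

% Relative entropy (Kullback-Leibler divergence) of p with respect to q
K(p \,\|\, q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}
```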