On the entropy of a function

  • Authors:
  • Rudolph A. H. Lorentz

  • Affiliations:
  • Fraunhofer Institute for Scientific Computations and Algorithms, Germany and University of Duisburg-Essen, Germany

  • Venue:
  • Journal of Approximation Theory
  • Year:
  • 2009

Abstract

A common statement made when discussing the efficiency of compression programs like JPEG is that the transformations used, the discrete cosine or wavelet transform, decorrelate the data. The standard measure used for the information content of the data is the probabilistic entropy. The data can, in this case, be considered as the sampled values of a function. However no sampling independent definition of the entropy of a function has been proposed. Such a definition is given and it is shown that the entropy so defined is the same as the entropy of the sampled data in the limit as the sample spacing goes to zero.