A Measure of Information

  • Authors:
  • Mark R. Titchener

  • Venue:
  • DCC '00 Proceedings of the Conference on Data Compression
  • Year:
  • 2000

Abstract

Modern information theory is founded on the ideas of Hartley and Shannon, amongst others. From a practitioner's standpoint, Shannon's probabilistic framework carries certain impediments for the practical measurement of information, such as requiring a priori knowledge of a source's characteristics. Moreover, such a statistical formulation of entropy is an asymptotic limit, meaningful only within the context of an ensemble of messages. It thus fails to address the notion of an individual string having information content in and of itself.

However, in 1953, Cherry demonstrated that Shannon's entropy could be viewed equivalently as a measure of the average number of selections required to identify each message symbol from the alphabet. Here the terminology contrasts with Shannon's probabilistic formulation, with the process of counting selection steps appearing to be meaningful for individual, isolated, finite strings.

We explore this alternative approach in the context of a recursive hierarchical pattern-copying (RHPC) algorithm, which we use to measure the complexity of finite strings in terms of the number of steps required to recursively construct the string from its alphabet. From this we compute an effective rate of steps-per-symbol required for linearly constructing the string.

By Cherry's interpretation of Shannon's entropy, we infer an asymptotic equivalence between the two approaches, but perhaps the real significance of this new way of measuring information is its applicability and usefulness in evaluating individual finite strings.
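The abstract does not spell out the RHPC algorithm itself, but the underlying idea, counting the construction steps needed to build an individual finite string from its alphabet and normalising by the string's length to get a steps-per-symbol rate, can be illustrated with a much simpler dictionary-based parse. The sketch below is a hypothetical stand-in using an LZ78-style incremental parse; it is not Titchener's RHPC algorithm, and the function names (construction_steps, steps_per_symbol) are illustrative only.

```python
def construction_steps(s: str) -> int:
    """Count construction steps for s using a simple LZ78-style parse:
    each step takes a previously constructible phrase and extends it
    by one symbol from the alphabet.

    NOTE: an illustrative stand-in, not the paper's RHPC algorithm.
    """
    phrases = {""}   # phrases already constructible (the empty phrase is free)
    steps = 0
    current = ""
    for symbol in s:
        candidate = current + symbol
        if candidate in phrases:
            current = candidate        # still copying a known phrase
        else:
            phrases.add(candidate)     # one new construction step
            steps += 1
            current = ""
    if current:                        # trailing, already-known phrase
        steps += 1
    return steps


def steps_per_symbol(s: str) -> float:
    """Effective rate of construction steps per symbol for the string s."""
    return construction_steps(s) / len(s) if s else 0.0


if __name__ == "__main__":
    for example in ["aaaaaaaaaaaaaaaa", "abababababababab", "the quick brown fox"]:
        print(f"{example!r}: {steps_per_symbol(example):.3f} steps/symbol")
```

As expected of any such measure, highly repetitive strings yield a low steps-per-symbol rate, while strings with little internal structure approach one step per symbol; the paper's claim is that, via Cherry's selection-counting reading of entropy, a rate of this kind converges to the Shannon entropy rate for long strings while remaining meaningful for an individual finite string.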