Information Efficiency

  • Authors: Joel Ratsaby
  • Affiliation: Ben Gurion University of the Negev, Israel
  • Venue: SOFSEM '07: Proceedings of the 33rd Conference on Current Trends in Theory and Practice of Computer Science
  • Year: 2007

Abstract

Shannon's theory of information rests on a probabilistic representation of events that convey information, e.g., sending messages over a communication channel. Kolmogorov argues that information is a more fundamental concept which exists also in problems with no underlying stochastic model, for instance, the information contained in an algorithm or in the genome. In a classic paper he defines the discrete entropy of a finite set, which establishes a combinatorially based definition of the information I(x : y) conveyed by a variable x (taking a binary string value x) about the unknown value of a variable y. The current paper extends Kolmogorov's definition of information to a more general setting where, given x = x', there may still be uncertainty about the set of possible values of y. It then establishes a combinatorially based description complexity of x and introduces a novel concept termed information width, similar to the n-widths of approximation theory. This forms the basis of new measures of the cost and efficiency of information, which give rise to a new framework whereby information from any input source, e.g., sample-based, general side-information, or a hybrid of both, is represented and computed according to a single common formula. As an application, we consider the space of Boolean functions, where input strings x correspond to descriptions of properties of classes of Boolean functions.
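
For reference, a minimal sketch of the classical combinatorial quantities the abstract builds on, in the standard formulation of Kolmogorov's 1965 paper (the notation Y_{x'} below is introduced here for illustration and is not taken from the paper):

    H(A) = \log_2 |A|   % discrete entropy of a finite set A of alternatives

and, if learning that x = x' narrows the unknown y from a finite set Y to a subset Y_{x'} \subseteq Y,

    I(x : y) = H(Y) - H(Y_{x'}) = \log_2 |Y| - \log_2 |Y_{x'}|.

The extension studied in this paper concerns the case where, even after x = x' is observed, the subset Y_{x'} itself remains uncertain.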