On the combinatorial representation of information

  • Authors: Joel Ratsaby
  • Affiliations: Ben Gurion University of the Negev, Israel
  • Venue: COCOON'06: Proceedings of the 12th Annual International Conference on Computing and Combinatorics
  • Year: 2006

Abstract

Kolmogorov introduced a combinatorial measure of the information $I(x:y)$ about the unknown value of a variable $\mathsf{y}$ conveyed by an input variable $\mathsf{x}$ taking a given value $x$. The paper extends this definition of information to a more general setting in which "$\mathsf{x}=x$" may provide a vaguer description of the possible value of $\mathsf{y}$. As an application, the space $\mathcal{P}(\{0,1\}^n)$ of classes of binary functions $f:[n]\to\{0,1\}$, $[n]=\{1,\dots,n\}$, is considered, where $\mathsf{y}$ represents an unknown function $t\in\{0,1\}^{[n]}$. As input, two extreme cases are considered: $\mathsf{x}=x_{\mathcal{M}_d}$ and $\mathsf{x}=x_{\mathcal{M}'_d}$, which indicate that $t$ is an element of a set $G\subseteq\{0,1\}^n$ satisfying a property $\mathcal{M}_d$ or $\mathcal{M}'_d$, respectively. Property $\mathcal{M}_d$ (respectively $\mathcal{M}'_d$) means that there exists an $E\subseteq[n]$, $|E|=d$, such that $|\mathrm{tr}_E(G)|=1$ (respectively $2^d$), where $\mathrm{tr}_E(G)$ denotes the trace of $G$ on $E$. Estimates of the information values $I(x_{\mathcal{M}_d}:t)$ and $I(x_{\mathcal{M}'_d}:t)$ are obtained. For fixed $d$, it is shown that $I(x_{\mathcal{M}_d}:t)\approx d$ and $I(x_{\mathcal{M}'_d}:t)\approx 1$ as $n\to\infty$.
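
As a concrete illustration of the trace properties described in the abstract, the sketch below enumerates a toy class $G\subseteq\{0,1\}^{[n]}$ and checks properties $\mathcal{M}_d$ and $\mathcal{M}'_d$. The log-counting reading of the information (log2 of the number of candidates for $t$ a priori minus log2 of the number remaining after the input), the particular class $G$, the helper names trace and satisfies_property, and the choice n = 4, d = 2 are assumptions made here for illustration only; the paper's formal definitions and asymptotic estimates are more general.

    import math
    from itertools import combinations, product

    def trace(G, E):
        """Trace of a class G of binary functions on the index set E:
        the set of distinct restrictions of members of G to the coordinates in E."""
        return {tuple(f[i] for i in E) for f in G}

    def satisfies_property(G, d, n, shattered=False):
        """M_d  (shattered=False): some E with |E| = d has |tr_E(G)| = 1.
        M'_d (shattered=True):  some E with |E| = d has |tr_E(G)| = 2**d."""
        target = 2 ** d if shattered else 1
        return any(len(trace(G, E)) == target
                   for E in combinations(range(n), d))

    n, d = 4, 2
    all_t = list(product([0, 1], repeat=n))   # {0,1}^[n]: 2**n candidates for t

    # Hypothetical class G: every member takes the values 1, 0 on coordinates 0, 1,
    # so the trace of G on E = {0, 1} is a single pattern and G satisfies M_d.
    G = [f for f in all_t if f[0] == 1 and f[1] == 0]
    assert satisfies_property(G, d, n)

    # Log-counting heuristic (an assumption, not the paper's derivation): if G and E
    # are known, t is pinned down on the d coordinates of E, so the candidate set
    # shrinks from 2**n to 2**(n - d), i.e. about d bits, consistent with the
    # abstract's estimate I(x_{M_d} : t) ~ d for fixed d.
    before = len(all_t)           # 2**n candidates a priori
    after = 2 ** (n - d)          # candidates once t is fixed on E
    print(math.log2(before) - math.log2(after))   # 2.0, i.e. d bits

By contrast, a class whose trace on some $E$ of size $d$ is all of $\{0,1\}^d$ (property $\mathcal{M}'_d$) leaves every pattern on $E$ possible, which gives an informal sense of why the corresponding information value stays small; the precise estimate $I(x_{\mathcal{M}'_d}:t)\approx 1$ is established in the paper.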