Appendix: on common information and related characteristics of correlated information sources

  • Authors:
  • R. Ahlswede; J. Körner

  • Affiliations:
  • Fakultät für Mathematik, Universität Bielefeld, Bielefeld, Germany

  • Venue:
  • General Theory of Information Transfer and Combinatorics
  • Year:
  • 2006


Abstract

This is a literal copy of a manuscript from 1974; only the references have been updated. It contains a critical discussion of the then-recent concepts of “common information” and also suggests alternative definitions. (Compare pages 402–405 in the book by I. Csiszár and J. Körner, “Information Theory: Coding Theorems for Discrete Memoryless Systems”, Akademiai Kiado, Budapest 1981.) One of our definitions gave rise to the now well-known source coding problem for two helpers (formulated in 2) on page 7). More importantly, an extension of one concept to “common information with list knowledge” has recently (R. Ahlswede and V. Balakirsky, “Identification under Random Processes”, invited paper in honor of Mark Pinsker, Sept. 1995) turned out to play a key role in analyzing the contribution of a correlated source to the identification capacity of a channel. Thus the old ideas have now led to concepts of operational significance and are therefore made accessible here.