On the loss of single-letter characterization: the dirty multiple access channel

  • Authors:
  • Tal Philosof; Ram Zamir

  • Affiliations:
  • Department of Electrical Engineering-Systems, Tel-Aviv University, Ramat-Aviv, Tel-Aviv, Israel (both authors)

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2009

Abstract

For general memoryless systems, the existing information-theoretic solutions have a "single-letter" form. This reflects the fact that optimum performance can be approached by a random code (or a random binning scheme) generated using independent and identically distributed copies of some scalar distribution. Is this the form of the solution to every (information-theoretic) problem? In fact, some counterexamples are known. The most famous one is the "two help one" problem: Körner and Marton showed that if we want to decode the modulo-two sum of two correlated binary sources from their independent encodings, then linear coding is better than random coding. In this paper we provide another counterexample, the "doubly-dirty" multiple-access channel (MAC). Like the Körner-Marton problem, this is a multiterminal scenario in which side information is distributed among several terminals: each transmitter knows part of the channel interference, while the receiver observes only the channel output. We give an explicit solution for the capacity region of the binary doubly-dirty MAC, demonstrate how this region can be approached using a linear coding scheme, and prove that the "best known single-letter region" is strictly contained in it. We also state a conjecture regarding the capacity loss of single-letter characterization in the Gaussian case.
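
The abstract invokes the Körner-Marton "two help one" result, whose advantage over random coding rests on a simple algebraic fact: if both encoders apply the same linear map over GF(2), the map commutes with the modulo-two sum. The following Python snippet is only an illustrative sketch of that linearity property, not the paper's construction; the matrix `H`, block lengths, and the toy source model are hypothetical placeholders.

```python
# Sketch of the linearity behind the Körner-Marton scheme (illustrative only):
# a common binary matrix H satisfies H x1 XOR H x2 = H (x1 XOR x2) over GF(2),
# so the receiver obtains a syndrome of z = x1 XOR x2 from the two independent
# encodings without learning x1 or x2 individually.
import numpy as np

rng = np.random.default_rng(0)

n, m = 8, 4                                  # source block length, syndrome length (toy values)
H = rng.integers(0, 2, size=(m, n))          # same binary matrix used by both encoders

x1 = rng.integers(0, 2, size=n)              # toy correlated binary sources
x2 = (x1 + rng.integers(0, 2, size=n)) % 2   # x2 = x1 XOR "noise"

def encode(H, x):
    """Linear encoding over GF(2): syndrome s = H x (mod 2)."""
    return H.dot(x) % 2

s1 = encode(H, x1)
s2 = encode(H, x2)

z = (x1 + x2) % 2                            # modulo-two sum the decoder wants
assert np.array_equal((s1 + s2) % 2, encode(H, z))
print("syndrome of z recovered from independent encodings:", (s1 + s2) % 2)
```

In the actual Körner-Marton scheme, `H` is the parity-check matrix of a linear code chosen to match the statistics of z, so that z itself can be decoded from its syndrome; here a random `H` is used purely to exhibit the commutation property.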