Lattice Strategies for the Dirty Multiple Access Channel
Allerton'09: Proceedings of the 47th Annual Allerton Conference on Communication, Control, and Computing
For general memoryless systems, the existing information-theoretic solutions have a "single-letter" form. This reflects the fact that optimum performance can be approached by a random code (or a random binning scheme) generated using independent and identically distributed copies of some scalar distribution. Does every (information-theoretic) problem admit a solution of this form? In fact, some counterexamples are known. The most famous is the "two-help-one" problem: Körner and Marton showed that if we want to decode the modulo-two sum of two correlated binary sources from their independent encodings, then linear coding is better than random coding.

In this paper we provide another counterexample, the "doubly-dirty" multiple-access channel (MAC). Like the Körner-Marton problem, this is a multiterminal scenario in which side information is distributed among several terminals: each transmitter knows part of the channel interference, while the receiver observes only the channel output. We give an explicit solution for the capacity region of the binary doubly-dirty MAC, demonstrate how this region can be approached using a linear coding scheme, and prove that the "best known single-letter region" is strictly contained in it. We also state a conjecture regarding the capacity loss of single-letter characterization in the Gaussian case.
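To make the Körner-Marton idea concrete, the following is a minimal Python sketch of linear (syndrome) encoding: each terminal transmits only the syndrome of its own source word, and XORing the two syndromes yields the syndrome of the modulo-two sum, which the decoder recovers when that sum is sufficiently sparse. The (7,4) Hamming code and the assumption that the sum has at most one 1 per block are illustrative stand-ins for the long, capacity-approaching linear codes the actual argument requires.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column j is the
# binary expansion of j+1 (MSB in the top row).
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]], dtype=np.uint8)

def syndrome(word):
    # Each terminal transmits only this 3-bit linear function of its word.
    return H @ word % 2

def decode_sum(s):
    # Recover z = x XOR y from H @ z, assuming z has weight <= 1:
    # a nonzero syndrome is the binary expansion of the flipped position.
    z = np.zeros(7, dtype=np.uint8)
    if s.any():
        pos = int("".join(map(str, s)), 2) - 1
        z[pos] = 1
    return z

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 7, dtype=np.uint8)   # source word at terminal 1
z_true = np.zeros(7, dtype=np.uint8)
z_true[rng.integers(7)] = 1                 # x and y differ in one position
y = x ^ z_true                              # correlated source at terminal 2

# The receiver XORs the two independently computed syndromes:
s = syndrome(x) ^ syndrome(y)               # = H @ (x XOR y) mod 2
assert np.array_equal(decode_sum(s), z_true)
```

Neither encoder needs to see the other source: linearity alone makes the XOR of the encodings equal to the encoding of the XOR.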
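In the same spirit, here is a minimal sketch of a linear scheme for the binary doubly-dirty MAC, assuming a noiseless channel Y = X1 XOR X2 XOR S1 XOR S2 and ignoring the input constraint that shapes the actual capacity region; the block length, code, and message sizes are all illustrative. Each transmitter XORs its codeword with the interference it alone knows; the interference terms then cancel at the receiver, which observes the sum of two codewords drawn from complementary subcodes of one common systematic linear code, so both messages remain recoverable.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 8, 4                        # illustrative block length / code dimension

# One common systematic linear code G = [I | P]; user 1 encodes with the
# first k//2 rows, user 2 with the last k//2, so their subcodes intersect
# only in the zero codeword.
P = rng.integers(0, 2, (k, n - k), dtype=np.uint8)
G = np.hstack([np.eye(k, dtype=np.uint8), P])

m1 = rng.integers(0, 2, k // 2, dtype=np.uint8)   # user 1's message
m2 = rng.integers(0, 2, k // 2, dtype=np.uint8)   # user 2's message
u1 = m1 @ G[: k // 2] % 2
u2 = m2 @ G[k // 2:] % 2

s1 = rng.integers(0, 2, n, dtype=np.uint8)  # interference known to Tx 1 only
s2 = rng.integers(0, 2, n, dtype=np.uint8)  # interference known to Tx 2 only
x1 = u1 ^ s1                                # Tx 1 pre-cancels its own s1
x2 = u2 ^ s2                                # Tx 2 pre-cancels its own s2

y = x1 ^ x2 ^ s1 ^ s2              # noiseless binary doubly-dirty MAC
# Both interference terms cancel, leaving y = u1 XOR u2; since G is
# systematic, the two messages sit in the first k positions of that sum.
assert np.array_equal(y[:k], np.concatenate([m1, m2]))
```

As in the Körner-Marton sketch, linearity is doing the work: the receiver never needs to learn S1 or S2 individually, only the sum of the two codewords.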