We consider the distributed computation of a function of random sources with minimal communication. Specifically, given two discrete memoryless sources X and Y, a receiver wishes to compute f(X, Y) based on (encoded) information sent from X and Y in a distributed manner. A special case, f(X, Y) = (X, Y), is the classical question of distributed source coding considered by Slepian and Wolf (1973). Orlitsky and Roche (2001) considered a somewhat restricted setup in which Y is available as side information at the receiver. They characterized the minimal rate at which X needs to transmit data to the receiver as the conditional graph entropy of the characteristic graph of X based on f. In our recent work (2006), we further established that this minimal rate can be achieved by means of graph coloring and distributed source coding (e.g., Slepian-Wolf coding). This characterization allows for a separation between "function coding" and "correlation coding." In this paper, we consider a more general setup in which X and Y are both encoded separately. This is a significantly harder setup for which to give a single-letter characterization of the complete rate region. We find that under a certain condition on the support set of X and Y (called the zigzag condition), it is possible to characterize the rate region based on graph colorings at X and Y separately. That is, any achievable pair of rates can be realized by first coloring graphs at X and Y separately (function coding) and then using Slepian-Wolf coding for these colors (correlation coding). We also obtain a single-letter characterization of the minimal joint rate. Finally, we provide simulation results based on graph coloring to establish the rate gains on real sequences.
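The "function coding" step the abstract describes can be sketched concretely. A minimal illustration, not the paper's implementation: build the characteristic graph of X (two x-values are connected if some jointly possible y makes f disagree on them), then color it greedily; values sharing a color can share a codeword. The function names and the toy source below are assumptions for illustration only.

```python
from itertools import combinations

def characteristic_graph(xs, ys, support, f):
    """Edge x1~x2 iff some y is jointly possible with both x1 and x2
    (i.e., (x1,y) and (x2,y) are in the support) and f(x1,y) != f(x2,y)."""
    edges = set()
    for x1, x2 in combinations(xs, 2):
        for y in ys:
            if (x1, y) in support and (x2, y) in support and f(x1, y) != f(x2, y):
                edges.add((x1, x2))
                break
    return edges

def greedy_coloring(xs, edges):
    """Assign each vertex the smallest color unused by its colored neighbors."""
    adj = {x: set() for x in xs}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    colors = {}
    for x in xs:
        used = {colors[n] for n in adj[x] if n in colors}
        c = 0
        while c in used:
            c += 1
        colors[x] = c
    return colors

# Toy source (hypothetical): f(x, y) = x mod 2 on a full support set.
# x-values of equal parity never need to be distinguished at the receiver,
# so they may legally share a color, hence a single codeword.
xs, ys = [0, 1, 2, 3], [0, 1]
support = {(x, y) for x in xs for y in ys}
f = lambda x, y: x % 2
edges = characteristic_graph(xs, ys, support, f)
colors = greedy_coloring(xs, edges)
# Here only 2 colors are needed for 4 source symbols: the colors, not the
# raw symbols, are then compressed with Slepian-Wolf coding.
```

Greedy coloring is only a heuristic upper bound on the chromatic number; achieving the graph-entropy rates in the abstract requires coloring high-probability subgraphs of powers of this graph, which this sketch does not attempt.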