A network of nodes communicates over point-to-point, independent, memoryless noisy channels. Each node holds a real-valued initial measurement or message, and each node aims to estimate a given function of all the initial measurements in the network. As the main contribution of this paper, a lower bound on computation time is derived: any algorithm the nodes use to communicate and compute must satisfy this bound if the mean-square error of the nodes' estimates is to lie within a given interval around zero. The derivation combines information-theoretic inequalities reminiscent of those used in rate-distortion theory with a novel "perturbation" technique, which makes the bound broadly applicable. To assess the tightness of the bound, a specific scenario is considered: nodes must learn a linear combination of the initial values in the network while communicating over erasure channels. A distributed quantized algorithm is developed, and its computation time is shown to scale essentially as the lower bound implies. In particular, the computation time depends inversely on the conductance of the network, a property that captures its information-flow bottleneck. As a by-product, this yields a quantized algorithm with minimal computation time for computing separable functions in a network.
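To make the setting concrete, the sketch below simulates pairwise randomized gossip toward the network average (a linear combination of the initial values) with uniform quantization and erasure links. This is a minimal illustration, not the paper's algorithm: the update rule, quantizer step, erasure model, and all function names here are illustrative assumptions. Note that the sum of the values is conserved by each exchange, while quantization leaves a residual error floor on the order of the quantizer step.

```python
import random

def quantize(x, step=0.01):
    """Uniform quantizer: round x to the nearest multiple of `step`."""
    return step * round(x / step)

def quantized_gossip_average(values, edges, erasure_prob=0.2,
                             step=0.01, rounds=20000, seed=0):
    """Pairwise randomized gossip toward the average (illustrative sketch).

    Each round activates a random edge (i, j). The endpoints exchange
    quantized copies of their current values; with probability
    `erasure_prob` the exchange is erased and no update occurs,
    mimicking a memoryless erasure channel.
    """
    rng = random.Random(seed)
    x = list(values)
    for _ in range(rounds):
        i, j = rng.choice(edges)
        if rng.random() < erasure_prob:
            continue  # channel erased this round; skip the update
        qi, qj = quantize(x[i], step), quantize(x[j], step)
        mid = 0.5 * (qi + qj)
        # Move both endpoints toward the midpoint of the quantized
        # values; the two corrections cancel, so the sum (and hence
        # the average) of the node values is preserved exactly.
        x[i] += mid - qi
        x[j] += mid - qj
    return x

if __name__ == "__main__":
    n = 16
    # A ring has low conductance, so convergence is comparatively slow.
    ring = [(k, (k + 1) % n) for k in range(n)]
    init = [float(k) for k in range(n)]
    est = quantized_gossip_average(init, ring)
    avg = sum(init) / n
    mse = sum((v - avg) ** 2 for v in est) / n
    print(f"true average {avg:.3f}, mean-square error {mse:.6f}")
```

Rerunning the example on a better-connected topology (e.g., a complete graph's edge list) drives the mean-square error down in far fewer rounds, consistent with the reciprocal dependence of computation time on conductance described above.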