Information theoretic bounds for distributed computation over networks of point-to-point channels

  • Authors:
  • Ola Ayaso; Devavrat Shah; Munther A. Dahleh

  • Affiliations:
  • Georgia Institute of Technology, Atlanta, GA, and Massachusetts Institute of Technology, Cambridge, MA; Massachusetts Institute of Technology, Cambridge, MA; Massachusetts Institute of Technology, Cambridge, MA

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2010

Abstract

Nodes in a network communicate via independent, memoryless, point-to-point noisy channels, and each node holds a real-valued initial measurement or message. The goal of every node is to acquire an estimate of a given function of all the initial measurements in the network. As the main contribution of this paper, a lower bound on computation time is derived. Any algorithm the nodes use to communicate and compute must satisfy this bound in order for the mean-square error of the nodes' estimates to lie within a given interval around zero. The derivation uses information-theoretic inequalities reminiscent of those in rate-distortion theory, together with a novel "perturbation" technique, so as to be broadly applicable. To assess the tightness of the bound, a specific scenario is considered: nodes must learn a linear combination of the initial values in the network while communicating over erasure channels. A distributed quantized algorithm is developed, and its computation time is shown to scale essentially as the lower bound implies. In particular, the computation time scales inversely with the "conductance" of the network, a property that captures its information-flow bottleneck. As a by-product, this yields a quantized algorithm for computing separable functions in a network with minimal computation time.
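
To make the two quantities in the abstract concrete, here is a minimal Python sketch, not the authors' algorithm: it computes the standard unweighted graph conductance by brute force (the paper's exact normalization may differ) and runs a toy sum-preserving quantized pairwise-averaging scheme, in the spirit of quantized gossip, over links whose transmissions fail with a fixed erasure probability. All parameter names (`erasure_p`, `step`, `tol`) are illustrative assumptions.

```python
import itertools
import random

def conductance(n, edges):
    # Brute-force conductance: min over nonempty S with |S| <= n/2 of
    # cut(S) / |S| (standard unweighted definition; the paper's
    # normalization may differ). Exponential in n, so toy graphs only.
    directed = {(u, v) for u, v in edges} | {(v, u) for u, v in edges}
    best = float("inf")
    for r in range(1, n // 2 + 1):
        for subset in itertools.combinations(range(n), r):
            S = set(subset)
            cut = sum(1 for u, v in directed if u in S and v not in S)
            best = min(best, cut / len(S))
    return best

def quantized_gossip(n, edges, values, erasure_p=0.2, step=0.01, tol=0.05, seed=1):
    # Toy stand-in for the paper's distributed quantized algorithm: each
    # round activates a random link; with probability erasure_p the exchange
    # is erased, otherwise the endpoints split their pair sum evenly on a
    # quantization grid of width `step`. The split preserves the global sum,
    # so the true average is invariant and every node's estimate settles to
    # within about one quantization step of it.
    rng = random.Random(seed)
    x = [round(v / step) * step for v in values]
    target = sum(x) / n
    edges = list(edges)
    rounds = 0
    while max(abs(xi - target) for xi in x) > tol:
        rounds += 1
        u, v = edges[rng.randrange(len(edges))]
        if rng.random() < erasure_p:
            continue                              # channel erased this exchange
        s = x[u] + x[v]
        half = round((s / 2) / step) * step       # quantized half of the pair sum
        x[u], x[v] = half, s - half               # sum-preserving update
    return rounds

if __name__ == "__main__":
    n = 8
    ring = [(i, (i + 1) % n) for i in range(n)]
    complete = [(i, j) for i in range(n) for j in range(i + 1, n)]
    values = [float(i) for i in range(n)]         # true average is 3.5
    for name, g in (("ring", ring), ("complete graph", complete)):
        print(name, "conductance:", conductance(n, g),
              "rounds to tol:", quantized_gossip(n, g, values))
```

Running the sketch on an 8-node ring (conductance 0.5) versus the complete graph (conductance 4) shows the ring taking many more rounds to reach the tolerance, loosely illustrating the inverse dependence of computation time on conductance that the paper establishes.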