Lower Bounds for the Noisy Broadcast Problem

  • Authors:
  • Navin Goyal; Guy Kindler; Michael Saks

  • Affiliations:
  • Dept. of Computer Science, Rutgers University; Princeton University; Dept. of Mathematics, Rutgers University

  • Venue:
  • FOCS '05 Proceedings of the 46th Annual IEEE Symposium on Foundations of Computer Science
  • Year:
  • 2005

Abstract

We prove the first non-trivial (superlinear) lower bound in the noisy broadcast model of distributed computation. In this model, there are n + 1 processors P0, P1, ..., Pn. Each Pi, for i ≥ 1, initially has a private bit xi, and the goal is for P0 to learn f(x1, ..., xn) for some specified function f. At each time step, a designated processor broadcasts some function of its private bit and the bits it has heard so far. Each broadcast is received by the other processors, but each reception may be corrupted by noise. In this model, Gallager [16] gave a noise-resistant protocol that allows P0 to learn the entire input in O(n log log n) broadcasts. We prove that Gallager's protocol is optimal up to a constant factor. Our lower bound follows from a lower bound in a new model, the generalized noisy decision tree model, which may be of independent interest.
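To make the model concrete, here is a minimal Python sketch (not from the paper; the function name and parameters are illustrative) of the naive repetition baseline: each processor broadcasts its bit k times, each reception is flipped independently with probability eps, and P0 decodes each bit by majority vote.

```python
import random

def repetition_protocol(bits, eps, k, rng):
    """Naive noisy-broadcast baseline (illustrative, not Gallager's protocol):
    each processor Pi broadcasts its private bit k times; every reception at
    P0 is flipped independently with probability eps; P0 takes a majority
    vote over the k noisy copies of each bit."""
    decoded = []
    for b in bits:
        # Count how many of the k noisy receptions arrive as 1.
        ones = sum((b ^ (rng.random() < eps)) for _ in range(k))
        decoded.append(1 if ones > k / 2 else 0)
    return decoded

rng = random.Random(0)
n, eps, k = 100, 0.1, 15          # assumed example parameters
bits = [rng.randint(0, 1) for _ in range(n)]
decoded = repetition_protocol(bits, eps, k, rng)
errors = sum(d != b for d, b in zip(decoded, bits))
print(errors)                      # with k this large, usually 0
```

With k = O(log n) repetitions per bit, the probability that any bit is decoded wrongly can be driven below 1/n, so this baseline uses O(n log n) broadcasts; Gallager's protocol achieves the same guarantee with only O(n log log n), and the paper shows this is tight.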