Distributed (Δ+1)-coloring in linear (in Δ) time

  • Authors:
  • Leonid Barenboim; Michael Elkin

  • Affiliations:
  • Ben-Gurion University of the Negev, Beer-Sheva, Israel (both authors)

  • Venue:
  • Proceedings of the Forty-First Annual ACM Symposium on Theory of Computing (STOC '09)
  • Year:
  • 2009


Abstract

The distributed (Δ + 1)-coloring problem is one of the most fundamental and well-studied problems in Distributed Algorithms. Starting with the work of Cole and Vishkin in 1986, a long line of gradually improving algorithms has been published. The current state-of-the-art running time is O(Δ log Δ + log* n), due to Kuhn and Wattenhofer (PODC'06). Linial (FOCS'87) proved a lower bound of (1/2) log* n for the problem, and Szegedy and Vishwanathan (STOC'93) provided a heuristic argument showing that algorithms from a wide family of locally iterative algorithms are unlikely to achieve a running time smaller than Θ(Δ log Δ). We present a deterministic (Δ + 1)-coloring distributed algorithm with running time O(Δ) + (1/2) log* n. We also present a tradeoff between the running time and the number of colors, and devise an O(Δ • t)-coloring algorithm with running time O(Δ / t + log* n), for any parameter t, 1 < t ≤ Δ^(1-ε), where ε is an arbitrarily small constant, 0 < ε < 1.
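For context on what such algorithms improve upon, the sketch below simulates the textbook baseline for this problem: starting from a proper m-coloring (here, unique vertex IDs), one color class is eliminated per synchronous round until only colors {0, ..., Δ} remain, taking m − (Δ + 1) rounds. This is a standard illustration, not the Barenboim-Elkin algorithm (whose point is the far faster O(Δ) + (1/2) log* n bound); the adjacency-dict representation and the function name are assumptions made for this example.

```python
# Baseline reduction from a proper m-coloring to a (Δ + 1)-coloring,
# eliminating one color class per simulated synchronous round.
# Illustrative sketch only; NOT the Barenboim-Elkin O(Δ) + (1/2)log*n algorithm.

def reduce_to_delta_plus_one(adj):
    """adj: dict mapping each vertex to the set of its neighbors (undirected)."""
    delta = max(len(nbrs) for nbrs in adj.values())   # maximum degree Δ
    color = {v: i for i, v in enumerate(adj)}         # unique IDs = proper m-coloring
    m = len(adj)
    # One round per eliminated color class: colors m-1 down to Δ+1.
    for c in range(m - 1, delta, -1):
        for v in [u for u in adj if color[u] == c]:
            used = {color[w] for w in adj[v]}
            # A free color in {0, ..., Δ} always exists: v has at most Δ neighbors.
            color[v] = min(set(range(delta + 1)) - used)
    return color

if __name__ == "__main__":
    # 5-cycle: Δ = 2, so we expect a proper 3-coloring.
    adj = {0: {1, 4}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 0}}
    coloring = reduce_to_delta_plus_one(adj)
    assert all(coloring[u] != coloring[v] for u in adj for v in adj[u])
    print(coloring)
```

Vertices sharing a color in a proper coloring form an independent set, so all members of the eliminated class can recolor simultaneously in one round without conflict; the sequential inner loop here merely simulates that synchronous step.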