On the parallel complexity of computing a maximal independent set in a hypergraph

  • Author:
  • Pierre Kelsen

  • Affiliation:
  • Department of Computer Sciences, University of Texas at Austin, Austin, TX

  • Venue:
  • STOC '92: Proceedings of the Twenty-Fourth Annual ACM Symposium on Theory of Computing
  • Year:
  • 1992

Abstract

A maximal independent set in a hypergraph is a subset of vertices that is maximal with respect to the property of not containing any edge of the hypergraph. We show that an algorithm proposed by Beame and Luby is in randomized NC for hypergraphs in which the maximum edge size is bounded by a constant. To prove this, we bound the upper tail of sums of dependent random variables defined on the edges of a hypergraph; these bounds may be viewed as extensions of bounds on the tail of the binomial distribution. We derandomize this algorithm to obtain the first sublinear-time deterministic algorithm for hypergraphs with edges of size O(1). The algorithm exhibits the following time-processor tradeoff: it can be made to run in time O(n^ε) with n^{O(1/ε)} processors for a hypergraph on n vertices, for any ε ≥ 2^{d+1} · (log log n)/(log n); here d = O(1) denotes the maximum size of an edge in the input hypergraph H. In particular, for any constant ε > 0 we have an algorithm running in time O(n^ε) on a polynomial number of processors, and we have an algorithm running in time (log n)^{O(1)} on n^{O(log n/log log n)} processors.
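
The abstract states the definition but not the algorithm itself. The following is a minimal sequential sketch, in Python, that illustrates only the object being computed: a vertex set containing no edge of the hypergraph, maximal under inclusion. It is a naive greedy baseline, not the parallel Beame-Luby algorithm analyzed in the paper; all names here are illustrative.

```python
from typing import FrozenSet, Iterable, List, Set

def greedy_mis(vertices: Iterable[int], edges: Iterable[FrozenSet[int]]) -> Set[int]:
    """Greedy sequential maximal independent set in a hypergraph.

    A set S is independent if no edge e satisfies e <= S; it is maximal
    if adding any remaining vertex would complete some edge inside S.
    """
    edge_list: List[FrozenSet[int]] = [frozenset(e) for e in edges]
    s: Set[int] = set()
    for v in vertices:
        # Adding v is safe unless some edge containing v would then
        # lie entirely inside S.
        if all(not (e - {v} <= s) for e in edge_list if v in e):
            s.add(v)
    return s

if __name__ == "__main__":
    # Example hypergraph on 5 vertices with maximum edge size d = 3.
    V = range(5)
    E = [frozenset({0, 1, 2}), frozenset({1, 3}), frozenset({2, 3, 4})]
    print(greedy_mis(V, E))  # {0, 1, 4}: contains no edge; adding 2 or 3 would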