Toward a theory of maximally concurrent programs (shortened version)

  • Authors:
  • Rajeev Joshi; Jayadev Misra

  • Affiliations:
  • Compaq Systems Research Center, Palo Alto, CA; The University of Texas at Austin, Austin, TX

  • Venue:
  • Proceedings of the Nineteenth Annual ACM Symposium on Principles of Distributed Computing
  • Year:
  • 2000

Abstract

Typically, program design involves constructing a program P that implements a given specification S; that is, the set of executions of P is a subset of the set of executions satisfying S. In many cases, we seek a program P that not only implements S but whose set of executions is exactly the set of executions satisfying S. Then every execution satisfying the specification is a possible execution of the program; we call such a P maximal for the specification S. We argue that maximality is an important criterion in the design of concurrent programs because it disallows implementations that do not exhibit enough concurrency. In addition, a maximal solution can serve as a basis for deriving a variety of implementations, each appropriate for execution on a specific computing platform.

This paper also describes a method for proving the maximality of a program with respect to a given specification. Even though we prove facts about the possible executions of programs, there is no need to appeal to branching-time logics; we employ a fragment of linear temporal logic for our proofs. The method yields concise proofs of maximality for several non-trivial examples, and it may also serve as a guide in constructing maximal programs.
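The two notions in the abstract can be summarized formally. The following is a minimal sketch in our own notation, where ⟦·⟧ denotes "set of executions" (this bracket notation is an assumption here, not necessarily the paper's; the paper treats programs and specifications directly as sets of executions):

\[
P \text{ implements } S \;\triangleq\; \llbracket P \rrbracket \subseteq \llbracket S \rrbracket
\qquad\qquad
P \text{ is maximal for } S \;\triangleq\; \llbracket P \rrbracket = \llbracket S \rrbracket
\]

Read this way, maximality rules out implementations that resolve nondeterminism too restrictively: every execution permitted by the specification must remain a possible execution of the program.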