SPMD programs are usually written from the perspective of a single processor, yet the intended behaviour is an aggregate computation comprising many processors running the same program on local data. Combinators, such as map, fold, scan and multibroadcast, provide a flexible way to express SPMD programs more naturally and more abstractly at the collective level. A good SPMD programming methodology begins with a specification at the collective level, where many significant transformations and optimisations can be introduced. Eventually, however, this collective-level program must be transformed to the individual level in order to be executable on an SPMD system. This paper introduces a technique that makes the transformation possible within a formal framework: a special collective semantics for the individual-level program is required to justify a transformation from the collective level to the individual level. The collective semantics defines the meanings of the collective communication operations, and it allows equational reasoning to be used for deriving and implementing SPMD programs.
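As a rough illustration of the collective level (a sketch under assumed semantics, not the paper's formal definitions), an SPMD aggregate can be modelled as a list of per-processor local values, with each combinator defined as a pure function on that list. The names `cmap`, `cfold`, `cscan` and `multibroadcast` below are hypothetical; their behaviours are chosen to mirror the familiar MPI collectives Allreduce, Scan and Allgather.

```haskell
-- A minimal sketch, assuming a list model of the SPMD machine:
-- index i of the list holds the local value on processor i.
module Main where

type Aggregate a = [a]

-- map: every processor applies f to its local value (no communication).
cmap :: (a -> b) -> Aggregate a -> Aggregate b
cmap = map

-- fold: every processor ends up holding the combined value
-- (analogous to MPI_Allreduce).
cfold :: (a -> a -> a) -> Aggregate a -> Aggregate a
cfold f xs = replicate (length xs) (foldr1 f xs)

-- scan: processor i holds the combination of values on processors 0..i
-- (analogous to MPI_Scan).
cscan :: (a -> a -> a) -> Aggregate a -> Aggregate a
cscan = scanl1

-- multibroadcast: every processor receives every local value
-- (analogous to MPI_Allgather).
multibroadcast :: Aggregate a -> Aggregate [a]
multibroadcast xs = replicate (length xs) xs

main :: IO ()
main = do
  let xs = [1, 2, 3, 4] :: Aggregate Int
  print (cfold (+) xs)       -- [10,10,10,10]
  print (cscan (+) xs)       -- [1,3,6,10]
  print (multibroadcast xs)  -- [[1,2,3,4],[1,2,3,4],[1,2,3,4],[1,2,3,4]]
```

Because each combinator is an ordinary function on the aggregate, collective-level programs built from them can be manipulated by equational reasoning (for example, `cmap f . cmap g = cmap (f . g)`), which is the style of transformation the paper's collective semantics is designed to justify.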