Generating explicit communication from shared-memory program references

  • Authors:
  • Jingke Li, Marina Chen

  • Affiliation:
  • Department of Computer Science, Yale University, P.O. Box 2158, Yale Station, New Haven, CT

  • Venue:
  • Proceedings of the 1990 ACM/IEEE conference on Supercomputing
  • Year:
  • 1990

Abstract

This paper addresses the problem of data distribution and communication synthesis when generating parallel programs targeted at massively parallel, distributed-memory machines. The source programs can be sequential, functional, or parallel programs based on a shared-memory model. Our approach is to analyze source program references and match syntactic reference patterns with appropriate aggregate communication routines that can be implemented efficiently on the target machine. We use an explicit communication metric to guide optimizations that reduce communication overhead. The target code with explicit communication is proven free of deadlock introduced by the compilation process.
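To illustrate the idea of turning a shared-memory reference pattern into an explicit aggregate communication, consider a minimal sketch below. It assumes a one-dimensional block distribution of arrays A and B over the processes and the reference pattern A[i] = B[i+1]; MPI (which postdates the paper) stands in for the target machine's aggregate communication routines, and all names and sizes are illustrative rather than taken from the paper.

    /* Sketch: the reference B[i+1] under a block distribution crosses the
     * block boundary only at the last local element, so the whole loop's
     * communication collapses into one shift: receive one value from the
     * right neighbor, send one value to the left neighbor. */
    #include <mpi.h>
    #include <stdio.h>

    #define NLOCAL 4   /* block size owned by each process (assumption) */

    int main(int argc, char **argv) {
        int rank, nprocs;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        double B[NLOCAL + 1];            /* local block plus one halo cell */
        double A[NLOCAL];
        for (int i = 0; i < NLOCAL; i++)
            B[i] = rank * NLOCAL + i;    /* global index as dummy data */

        /* Explicit shift communication generated from the reference pattern. */
        int left  = (rank == 0)          ? MPI_PROC_NULL : rank - 1;
        int right = (rank == nprocs - 1) ? MPI_PROC_NULL : rank + 1;
        MPI_Sendrecv(&B[0],      1, MPI_DOUBLE, left,  0,
                     &B[NLOCAL], 1, MPI_DOUBLE, right, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        if (rank == nprocs - 1) B[NLOCAL] = 0.0;  /* boundary value */

        /* The loop body is now purely local computation. */
        for (int i = 0; i < NLOCAL; i++)
            A[i] = B[i + 1];

        printf("rank %d: A[0] = %.0f\n", rank, A[0]);
        MPI_Finalize();
        return 0;
    }

The point of the sketch is that the compiler-recognized pattern yields a single, structured shift rather than element-wise messages, which is the kind of aggregate routine the paper's communication metric is meant to favor.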