Multidimensional Blocking in UPC

  • Authors:
  • Christopher Barton; Călin Caşcaval; George Almasi; Rahul Garg; José Nelson Amaral; Montse Farreras

  • Affiliations:
  • University of Alberta, Edmonton, Canada; IBM T.J. Watson Research Center; IBM T.J. Watson Research Center; University of Alberta, Edmonton, Canada; University of Alberta, Edmonton, Canada; Barcelona Supercomputing Center, Universitat Politècnica de Catalunya

  • Venue:
  • Languages and Compilers for Parallel Computing
  • Year:
  • 2007


Abstract

Partitioned Global Address Space (PGAS) languages offer an attractive, high-productivity programming model for large-scale parallel machines. PGAS languages, such as Unified Parallel C (UPC), combine the simplicity of shared-memory programming with the efficiency of the message-passing paradigm by giving users control over the data layout. PGAS languages distinguish between private, shared-local, and shared-remote memory, with shared-remote accesses typically much more expensive than shared-local and private accesses, especially on distributed-memory machines, where a shared-remote access implies communication over a network.

In this paper we present a simple extension to the UPC language that allows the programmer to block shared arrays in multiple dimensions. We claim that this extension allows for better control of locality, and therefore performance, in the language.

We describe an analysis that allows the compiler to distinguish between local and remote shared array accesses. Local shared array accesses are then transformed into direct memory accesses by the compiler, saving the overhead of a locality check at runtime. We present results showing that the locality analysis significantly reduces the number of runtime-resolved shared accesses.