Constraint-based array dependence analysis

  • Authors:
  • William Pugh; David Wonnacott

  • Affiliations:
  • Univ. of Maryland, College Park; Haverford College, Haverford, PA

  • Venue:
  • ACM Transactions on Programming Languages and Systems (TOPLAS)
  • Year:
  • 1998

Abstract

Traditional array dependence analysis, which detects potential memory aliasing of array references, is a key analysis technique for automatic parallelization. Recent studies of benchmark codes indicate that limitations of analysis cause many compilers to overlook large amounts of potential parallelism, and that exploiting this parallelism requires algorithms to answer new questions about array references, not just get better answers to the old questions of aliasing. We need to ask about the flow of values in arrays, to check the legality of array privatization, and about the conditions under which a dependence exists, to obtain information about conditional parallelism. In some cases, we must answer these questions about code containing nonlinear terms in loop bounds or subscripts. This article describes techniques for phrasing these questions in terms of systems of constraints. Conditional dependence analysis can be performed with a constraint operation we call the "gist" operation. When subscripts and loop bounds are affine, questions about the flow of values in array variables can be phrased in terms of Presburger Arithmetic. When the constraints describing a dependence are not affine, we introduce uninterpreted function symbols to represent the nonaffine terms. Our constraint language also provides a rich language for communication with the dependence analyzer, by either the programmer or other phases of the compiler. This article also documents our investigations of the practicality of our approach. The worst-case complexity of Presburger Arithmetic indicates that it might be unsuitable for any practical application. However, we have found that analysis of benchmark programs does not cause the exponential growth in the number of constraints that could occur in the worst case. We have studied the constraints produced during our analysis, and identified characteristics that keep our algorithms free of exponential behavior in practice.
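
As a rough illustration of the kind of constraint system the abstract refers to (this example is not taken from the article itself), consider a loop of the form "for i = 1 to n: a[i+1] = a[i] + c". A flow dependence from the write a[i+1] in iteration i_w to the read a[i] in a later iteration i_r exists exactly when the following Presburger formula is satisfiable:

\[
\exists\, i_w, i_r:\; 1 \le i_w \le n \;\wedge\; 1 \le i_r \le n \;\wedge\; i_w + 1 = i_r \;\wedge\; i_w < i_r
\]

Under the loop's own bounds, the constraints simplify to the condition n >= 2; extracting such a residual condition from the dependence constraints, given what is already known from the surrounding context, is the role the abstract assigns to the "gist" operation, yielding the conditional-parallelism information described above.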