Parallelizing compilers rely on data dependence information in order to produce valid parallel code. Traditional data dependence analysis techniques, such as the Banerjee test and the I-Test, can efficiently compute data dependence information for simple instances of the data dependence problem. However, in more complex cases involving triangular or trapezoidal loop regions, symbolic variables, and multidimensional arrays with coupled subscripts, these tests, including the triangular Banerjee test, ignore or simplify many of the constraints and thus introduce approximations, especially when testing for data dependence under direction vector constraints. The Omega test can accurately handle such complex cases, but at a higher computation cost. In this paper we extend the ideas behind the I-Test and present new techniques to handle complex instances of the dependence problem, which are frequently found in actual source code. In particular, we provide polynomial-time techniques that can prove or disprove data dependences, subject to any direction vector, in loops with triangular or trapezoidal bounds, symbolic variables and multidimensional arrays with coupled subscripts. We also investigate the impact of the proposed data dependence analysis techniques in practice. We perform an extensive experimental evaluation of the data dependence analysis tests, including the I-Test, the Omega test and the proposed new techniques. We compare these tests in terms of data dependence accuracy, compilation efficiency and effectiveness in program parallelization. We run several experiments using the Perfect Club benchmarks and the scientific library Lapack. We analyze the trade-off between accuracy and efficiency and the reasons for any approximation of each data dependence test. We determine the impact of the dependence analysis phase on the total compilation time and we measure the number of loops parallelized by each test.
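To illustrate the kind of linear dependence equation these tests reason about, here is a minimal sketch of the classical GCD test, the simplest member of this family of tests (shown for context only; it is not the technique proposed in this paper). For accesses `X[a*i]` and `X[b*j + c]` in a loop, a dependence requires an integer solution of `a*i - b*j = c`, and such a solution exists only if `gcd(a, b)` divides `c`; the test ignores loop bounds and direction vectors, which is exactly the kind of approximation the stronger tests discussed above try to avoid.

```python
from math import gcd

def gcd_test(a: int, b: int, c: int) -> bool:
    """GCD test for the dependence equation a*i - b*j = c.

    An integer solution (i, j) exists iff gcd(a, b) divides c,
    so the test can disprove a dependence but never confirm one
    (loop bounds and direction vectors are ignored).
    """
    return c % gcd(a, b) == 0  # True => a dependence is still possible

# Can X[2*i] and X[2*j + 1] ever refer to the same element?
# Equation 2*i - 2*j = 1: gcd(2, 2) = 2 does not divide 1 -> independent.
print(gcd_test(2, 2, 1))  # False: dependence disproved

# Equation 2*i - 4*j = 6: gcd(2, 4) = 2 divides 6 -> possibly dependent
# (e.g. i = 5, j = 1), so a more precise test must take over.
print(gcd_test(2, 4, 6))  # True: dependence not disproved
```

When the GCD test cannot disprove a dependence, a compiler falls back on stronger (and costlier) tests such as the Banerjee test, the I-Test, or the Omega test, which is the accuracy/efficiency trade-off studied in this paper.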
We conclude that polynomial-time techniques can be employed to improve data dependence accuracy and increase program parallelization at a reasonable computation cost.