The design of consistency models for both hardware and software is a difficult task. For a programming language it is particularly difficult, because the target audience of a high-level programming language is much wider than that of a machine language, making usability a more important criterion. Exacerbating this problem, the programming language community has little experience designing programming language consistency models, so each new attempt is very much a voyage into uncharted territory. A concrete example of the difficulty of the task is the current Java Memory Model: although designed to be easy for Java programmers to use, it is poorly understood, and at least one common idiom that exploits the model (the 'double-checked locking' idiom) is unsafe. In this paper, we describe the design of an optimizing Java compiler that will accept a consistency model for the code to be compiled, specified either as input or as an interface implementation. The compiler will use Shasha and Snir's delay set analysis and our CSSA (concurrent static single assignment) program representation to provide a canonical representation for the effects of different consistency models on optimizations and analyses. The compiler will serve as a testbed for prototyping new memory models and for measuring the effects of different memory models on program performance. Copyright © 2004 John Wiley & Sons, Ltd.
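For concreteness, the unsafe idiom mentioned above is the classic double-checked locking pattern, sketched below (the class names are illustrative, not from the paper). Under the original Java Memory Model, the unsynchronized first read of `instance` may observe a reference to a partially constructed object, because the writes initializing the object can be reordered with the write publishing the reference.

```java
// Illustrative sketch of the double-checked locking idiom.
// Unsafe under the original JMM; the revised JMM repairs it only
// if the `instance` field is declared volatile.
class Helper {
    int value = 42; // initialization may be reordered with publication
}

class Singleton {
    private static Helper instance; // unsafe: needs `volatile` to be correct

    static Helper getHelper() {
        if (instance == null) {             // first check, no lock held
            synchronized (Singleton.class) {
                if (instance == null) {     // second check, lock held
                    instance = new Helper();
                }
            }
        }
        return instance;
    }
}

public class DclDemo {
    public static void main(String[] args) {
        Helper h1 = Singleton.getHelper();
        Helper h2 = Singleton.getHelper();
        System.out.println(h1 == h2);   // prints "true": same instance
        System.out.println(h1.value);   // prints "42"
    }
}
```

The single-threaded demo above always succeeds; the bug only manifests when a second thread takes the lock-free fast path while the first thread is mid-construction, which is precisely the kind of behavior a consistency model must pin down.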
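Shasha and Snir's delay set analysis, which the compiler builds on, identifies the program-order edges ("delays") that must be enforced to preserve sequential consistency: an edge needs a delay when it lies on a cycle mixing program-order edges with inter-thread conflict edges. The toy sketch below (not the paper's implementation) applies a simplified reachability-based version of this test to Dekker's example; real delay set analysis restricts attention to minimal mixed cycles, which this over-approximation happens to match on this example.

```java
import java.util.*;

// Toy sketch of the delay set idea on Dekker's example (illustrative only).
// A program-order edge u -> v needs a delay if v can reach u again through
// the union of program-order and conflict edges, i.e. u -> v closes a mixed cycle.
public class DelaySet {
    // Accesses: 0 = T1:W(x), 1 = T1:R(y), 2 = T2:W(y), 3 = T2:R(x)
    static final String[] NAME = {"T1:W(x)", "T1:R(y)", "T2:W(y)", "T2:R(x)"};
    // Program-order edges (directed, within a thread)
    static final int[][] PROGRAM = {{0, 1}, {2, 3}};
    // Conflict edges (same location, at least one write; traversable both ways)
    static final int[][] CONFLICT = {{0, 3}, {2, 1}};

    public static void main(String[] args) {
        for (int[] p : PROGRAM) {
            if (reaches(p[1], p[0])) {
                System.out.println("delay: " + NAME[p[0]] + " -> " + NAME[p[1]]);
            }
        }
    }

    // Depth-first reachability over program edges plus undirected conflict edges.
    static boolean reaches(int from, int to) {
        boolean[] seen = new boolean[NAME.length];
        Deque<Integer> stack = new ArrayDeque<>();
        stack.push(from);
        while (!stack.isEmpty()) {
            int u = stack.pop();
            if (u == to) return true;
            if (seen[u]) continue;
            seen[u] = true;
            for (int[] e : PROGRAM) if (e[0] == u) stack.push(e[1]);
            for (int[] e : CONFLICT) {
                if (e[0] == u) stack.push(e[1]);
                if (e[1] == u) stack.push(e[0]);
            }
        }
        return false;
    }
}
```

Both program-order edges are reported as delays, which matches the well-known result that Dekker's handshake requires fences (or sequentially consistent accesses) in both threads.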