Exploiting phase inter-dependencies for faster iterative compiler optimization phase order searches

  • Authors:
  • Michael R. Jantz; Prasad A. Kulkarni

  • Affiliations:
  • University of Kansas; University of Kansas

  • Venue:
  • Proceedings of the 2013 International Conference on Compilers, Architectures and Synthesis for Embedded Systems
  • Year:
  • 2013

Abstract

The problem of finding the most effective set and ordering of optimization phases to generate the best quality code is a fundamental issue in compiler optimization research. Unfortunately, the exorbitantly large phase order search spaces in current compilers make both exhaustive and heuristic approaches to finding the ideal optimization phase combination impractical in most cases. In this paper we show that one important reason existing search techniques are so expensive is that they make no attempt to exploit well-known independence relationships between optimization phases to reduce the search space and, correspondingly, improve the search time. We explore the impact of two complementary techniques to prune typical phase order search spaces. Our first technique studies the effect of implicitly applying cleanup phases, while the second partitions the set of phases into mutually independent groups and develops new multi-stage search algorithms that substantially reduce the search time with no effect on the best delivered code performance. Together, our techniques prune the exhaustive phase order search space by 89% on average (a 96.75% total search space reduction) and show immense potential for making iterative phase order searches more feasible and practical. The pruned search space enables us to find a small set of distinct phase sequences that reach near-optimal phase ordering performance for all our benchmark functions, as well as to improve the behavior of our genetic-algorithm-based heuristic search.
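
The staged, group-wise search described in the abstract can be pictured with a short sketch. The Python below is illustrative only and is not taken from the paper: the phase names, their grouping into independent sets, and the evaluate stub are hypothetical placeholders, and a real search would compile and measure each candidate ordering.

```python
from itertools import permutations

# Hypothetical partition of phases into mutually independent groups
# (names and grouping are placeholders, not the paper's actual phase set).
PHASE_GROUPS = [
    ["loop_unrolling", "loop_invariant_code_motion"],
    ["copy_propagation", "dead_code_elimination"],
    ["register_allocation", "instruction_scheduling"],
]

def evaluate(sequence):
    """Stand-in cost function: a real search would compile the benchmark
    with this phase sequence and measure the generated code.  This toy
    version scores sequences deterministically so the sketch runs."""
    return sum(i * len(phase) for i, phase in enumerate(sequence))

def staged_search(groups, evaluate):
    """Search each independent group's orderings separately and concatenate
    the per-group winners, instead of enumerating every ordering of the
    full phase set."""
    best_overall = []
    for group in groups:
        best_order, best_cost = None, float("inf")
        for order in permutations(group):        # exhaustive within one group
            cost = evaluate(best_overall + list(order))
            if cost < best_cost:
                best_order, best_cost = list(order), cost
        best_overall += best_order
    return best_overall

if __name__ == "__main__":
    print(staged_search(PHASE_GROUPS, evaluate))
```

The payoff of the staging is combinatorial: exhausting all orderings of the six phases above would cost 6! = 720 evaluations, while exhausting each two-phase group separately costs only 2! + 2! + 2! = 6, at the price of assuming the groups really are independent.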