Online performance auditing: using hot optimizations without getting burned

  • Authors:
  • Jeremy Lau (IBM T.J. Watson Research Center and University of California, San Diego); Matthew Arnold (IBM T.J. Watson Research Center); Michael Hind (IBM T.J. Watson Research Center); Brad Calder (University of California, San Diego)

  • Venue:
  • Proceedings of the 2006 ACM SIGPLAN Conference on Programming Language Design and Implementation (PLDI '06)
  • Year:
  • 2006


Abstract

As hardware complexity increases and virtualization is added at more layers of the execution stack, predicting the performance impact of optimizations becomes increasingly difficult. Production compilers and virtual machines invest substantial development effort in performance tuning to achieve good performance for a range of benchmarks. Although optimizations typically perform well on average, they often have unpredictable impact on running time, sometimes degrading performance significantly. Today's VMs perform sophisticated feedback-directed optimizations, but these techniques do not address performance degradations, and they actually make the situation worse by making the system more unpredictable.

This paper presents an online framework for evaluating the effectiveness of optimizations, enabling an online system to automatically identify and correct performance anomalies that occur at runtime. This work opens the door for a fundamental shift in the way optimizations are developed and tuned for online systems, and may allow the body of work in offline empirical optimization search to be applied automatically at runtime. We present our implementation and evaluation of this system in a product Java VM.
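The abstract does not spell out the auditing mechanism, but the core idea of evaluating an optimization online can be illustrated with a small sketch: dispatch invocations of a hot routine between its baseline and optimized versions, collect timing samples for each, and revert when the optimized version is slower with statistical confidence. The Java sketch below is an illustrative assumption, not the paper's implementation; the class `PerformanceAuditor`, the sample threshold, and the z-test cutoff are all hypothetical choices.

```java
import java.util.concurrent.ThreadLocalRandom;

/**
 * Minimal sketch of an online performance auditor (hypothetical design).
 * It randomly dispatches calls between a baseline and an optimized version
 * of the same routine, accumulates timing samples for each, and reverts to
 * the baseline if the optimized version is measurably slower. A real VM
 * would do this on compiled method bodies rather than Runnables.
 */
public class PerformanceAuditor {

    /** Running timing statistics for one version (Welford's algorithm). */
    static final class Timing {
        long n;
        double mean;  // running mean of sample times (ns)
        double m2;    // sum of squared deviations, for the variance

        void add(double x) {
            n++;
            double delta = x - mean;
            mean += delta / n;
            m2 += delta * (x - mean);
        }

        double variance() { return n > 1 ? m2 / (n - 1) : 0.0; }
    }

    private static final long MIN_SAMPLES = 30;  // assumed warm-up threshold

    private final Runnable baseline;
    private final Runnable optimized;
    private final Timing baseTime = new Timing();
    private final Timing optTime = new Timing();
    private volatile boolean reverted = false;

    PerformanceAuditor(Runnable baseline, Runnable optimized) {
        this.baseline = baseline;
        this.optimized = optimized;
    }

    /** Entry point: dispatches each call to one version and records its time. */
    void invoke() {
        if (reverted) { baseline.run(); return; }
        boolean useOpt = ThreadLocalRandom.current().nextBoolean();
        long start = System.nanoTime();
        (useOpt ? optimized : baseline).run();
        long elapsed = System.nanoTime() - start;
        (useOpt ? optTime : baseTime).add(elapsed);
        if (optTime.n >= MIN_SAMPLES && baseTime.n >= MIN_SAMPLES) {
            audit();
        }
    }

    /**
     * Two-sample z-test on mean invocation times. If the optimized version
     * is slower with high confidence, revert to the baseline.
     */
    private void audit() {
        double se = Math.sqrt(optTime.variance() / optTime.n
                            + baseTime.variance() / baseTime.n);
        if (se == 0) return;
        double z = (optTime.mean - baseTime.mean) / se;
        if (z > 2.33) {        // ~99% one-sided confidence that optimized is slower
            reverted = true;   // a real VM would recompile or patch the dispatch
        }
    }
}
```

Any system along these lines would also have to contend with timer noise, warm-up effects, and inputs that vary across invocations; handling those sources of variance is precisely what makes the statistical evaluation in the paper's framework nontrivial.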