Data cache locking for higher program predictability

  • Authors:
  • Xavier Vera; Björn Lisper; Jingling Xue

  • Affiliations:
  • Institutionen för Datateknik, Mälardalens Högskola, Västerås, Sweden; Institutionen för Datateknik, Mälardalens Högskola, Västerås, Sweden; University of New South Wales, Sydney, Australia

  • Venue:
  • SIGMETRICS '03: Proceedings of the 2003 ACM SIGMETRICS International Conference on Measurement and Modeling of Computer Systems
  • Year:
  • 2003

Abstract

Caches have become increasingly important with the widening gap between main memory and processor speeds. However, their dynamic behavior is a source of unpredictability, causing programs to behave differently than expected. Cache locking mechanisms adapt caches to the needs of real-time systems. Locking the cache trades performance for predictability: at the cost of generally lower performance, memory access times become predictable. This paper combines compile-time cache analysis with data cache locking to estimate the worst-case memory performance (WCMP) in a safe, tight and fast way. To obtain predictable cache behavior, we first lock the cache for those parts of the code where the static analysis fails. To minimize the performance degradation, our method loads the cache, if necessary, with data likely to be accessed. Experimental results show that this scheme is fully predictable, without compromising the performance of the transformed program. Compared to an algorithm that assumes compulsory misses when the state of the cache is unknown, our approach eliminates all overestimation for the set of benchmarks, giving an exact WCMP of the transformed program without any significant decrease in performance.
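
The pattern the abstract describes is to lock the data cache around code regions whose memory behavior the static analysis cannot predict, optionally preloading the data likely to be accessed before entering the region. Below is a minimal C sketch of that pattern; the intrinsics dcache_preload, dcache_lock and dcache_unlock are hypothetical placeholders for architecture-specific cache-control instructions and are not taken from the paper.

#include <stddef.h>

/* Hypothetical cache-control intrinsics (assumption, not from the paper).
 * On real hardware these map to architecture-specific lock/preload
 * instructions or control registers; here they are no-op stubs so the
 * sketch compiles. */
static inline void dcache_preload(const void *addr, size_t bytes)
{ (void)addr; (void)bytes; /* would fetch the given lines into the cache */ }
static inline void dcache_lock(void)
{ /* would freeze the cache contents (disable line replacement) */ }
static inline void dcache_unlock(void)
{ /* would restore the normal replacement policy */ }

/* Data-dependent (indirect) accesses defeat static cache analysis, so the
 * region runs with a locked cache: preload the data likely to be touched,
 * lock, execute, unlock.  Every access inside the region then has a known
 * hit/miss outcome, so its worst-case memory cost can be bounded exactly. */
long sum_indexed(const long *a, size_t n, const size_t *idx, size_t m)
{
    dcache_preload(a, n * sizeof a[0]);   /* load likely-accessed data   */
    dcache_lock();                        /* cache state is now fixed    */

    long s = 0;
    for (size_t i = 0; i < m; i++)
        s += a[idx[i]];                   /* predictable cost per access */

    dcache_unlock();                      /* resume normal cache behavior */
    return s;
}

Code outside such locked regions remains subject to the compile-time cache analysis; the locked region contributes a fixed, analyzable cost to the WCMP.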