Data Caches in Multitasking Hard Real-Time Systems

  • Authors:
  • Xavier Vera, Björn Lisper, Jingling Xue

  • Venue:
  • RTSS '03 Proceedings of the 24th IEEE International Real-Time Systems Symposium

  • Year:
  • 2003

Abstract

Data caches are essential in modern processors, bridging the widening gap between main memory and processor speeds. However, they yield very complex performance models, which makes it hard to bound execution times tightly.

This paper contributes a new technique to obtain predictability in preemptive multitasking systems in the presence of data caches. We explore the use of cache partitioning, dynamic cache locking and static cache analysis to provide worst-case performance estimates in a safe and tight way. Cache partitioning divides the cache among tasks to eliminate inter-task cache interferences. We combine static cache analysis and cache locking mechanisms to ensure that all intra-task conflicts, and consequently memory access times, are exactly predictable. To minimize the performance degradation due to cache partitioning and locking, two strategies are employed. First, the cache is loaded with data likely to be accessed, so that cache utilization is maximized. Second, compiler optimizations such as tiling and padding are applied in order to reduce cache replacement misses.

Experimental results show that this scheme is fully predictable without compromising the performance of the transformed programs. Our method outperforms static cache locking for all analyzed task sets under various cache architectures, with a CPU utilization reduction ranging between 3.8 and 20.0 times for a high-performance system.
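The compiler optimizations mentioned in the abstract are standard loop transformations. The sketch below is not code from the paper; it is a minimal illustration, with hypothetical sizes N, TILE and PAD, of how tiling confines each loop nest's working set to a cache-sized block and how padding a power-of-two row length breaks the regular conflict pattern that causes replacement misses.

```c
/* Illustrative sketch only: loop tiling and array padding in the spirit of
 * the abstract. N, TILE and PAD are assumed values, not taken from the paper. */
#include <stdio.h>

#define N    512          /* power-of-two row length: prone to conflict misses */
#define PAD  8            /* extra doubles per row to break the conflict pattern */
#define TILE 64           /* tile edge assumed to fit within one cache partition */

static double a[N][N + PAD];   /* padded source matrix */
static double b[N][N + PAD];   /* padded destination matrix */

/* Tiled transpose: each (ii, jj) tile is reused while it is cache-resident,
 * so the replacement misses per tile are bounded and easier to predict. */
static void transpose_tiled(void)
{
    for (int ii = 0; ii < N; ii += TILE)
        for (int jj = 0; jj < N; jj += TILE)
            for (int i = ii; i < ii + TILE; i++)
                for (int j = jj; j < jj + TILE; j++)
                    b[j][i] = a[i][j];
}

int main(void)
{
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            a[i][j] = (double)(i * N + j);

    transpose_tiled();
    printf("b[1][0] = %f\n", b[1][0]);   /* expect 1.0, i.e. a[0][1] */
    return 0;
}
```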