Cache partitioning for energy-efficient and interference-free embedded multitasking

  • Authors:
  • Rakesh Reddy; Peter Petrov

  • Affiliations:
  • InHand Electronics, Inc., Rockville, MD; University of Maryland, College Park, MD

  • Venue:
  • ACM Transactions on Embedded Computing Systems (TECS)
  • Year:
  • 2010

Abstract

We propose a technique that leverages configurable data caches to address the problems of energy inefficiency and intertask interference in multitasking embedded systems. Data caches are often necessary to provide the required memory bandwidth. However, caches introduce two important problems for embedded systems. Caches consume a significant amount of power, as they typically occupy a large part of the chip and are accessed frequently; in nanometer technologies, such large structures also contribute significantly to the total leakage power. Additionally, cache behavior in multitasking environments is notoriously difficult, if not impossible, to predict, resulting in poor real-time guarantees. We study the effect of multiprogramming workloads on the data cache in a preemptive multitasking environment and propose a technique that leverages configurable cache architectures not only to eliminate intertask cache interference, but also to significantly reduce both dynamic and leakage power. By mapping tasks to different cache partitions, interference is completely eliminated, and because only a subset of the cache is active at any given moment, both dynamic and leakage power are significantly reduced. We introduce a profile-based, off-line algorithm that identifies a beneficial cache partitioning. The OS configures the data cache at each context switch by activating the partition assigned to the incoming task. Our experiments on a large set of multitasking benchmarks demonstrate that the technique efficiently eliminates intertask interference while significantly reducing both dynamic and leakage power.
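
For illustration, the following is a minimal C sketch of the two mechanisms the abstract describes: an off-line, profile-based step that assigns each task a disjoint portion of the cache, and an OS context-switch hook that activates only the incoming task's partition. The way-granular partitioning, the CACHE_WAY_ENABLE register and its address, and the helper names are assumptions made for the sketch; they are not details taken from the paper.

```c
/*
 * Sketch of profile-based cache partitioning with per-task activation
 * at context switch. Hardware interface and granularity are assumed,
 * not drawn from the paper.
 */
#include <stdint.h>
#include <stddef.h>

#define NUM_WAYS 8u  /* assumed associativity of the configurable data cache */

/* Per-task cache configuration chosen off-line from profiling data. */
struct task_cache_cfg {
    uint32_t way_mask;   /* cache ways that form this task's partition */
};

/* Hypothetical memory-mapped control register of the configurable cache. */
#define CACHE_WAY_ENABLE (*(volatile uint32_t *)0x4000F000u)

/*
 * Off-line step (run on profiling data, not on the target): greedily give
 * each task enough ways to cover its profiled working set. Partitions are
 * disjoint by construction, so intertask interference cannot occur.
 */
static int partition_ways(const size_t working_set_ways[], size_t num_tasks,
                          struct task_cache_cfg cfg[])
{
    uint32_t next_way = 0;

    for (size_t t = 0; t < num_tasks; t++) {
        uint32_t mask = 0;
        for (size_t w = 0; w < working_set_ways[t]; w++) {
            if (next_way >= NUM_WAYS)
                return -1;              /* demand exceeds cache capacity */
            mask |= 1u << next_way++;
        }
        cfg[t].way_mask = mask;
    }
    return 0;
}

/*
 * Context-switch hook: enable only the incoming task's ways. Ways outside
 * the active partition remain disabled.
 */
static void activate_cache_partition(const struct task_cache_cfg *next)
{
    CACHE_WAY_ENABLE = next->way_mask;
}
```

In this sketch, only the ways named in the active task's mask are enabled at any time, which mirrors the abstract's observation that keeping just a subset of the cache active is what reduces both dynamic and leakage power.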