Hot-and-Cold: using criticality in the design of energy-efficient caches

  • Authors:
  • Rajeev Balasubramonian (School of Computing, University of Utah)
  • Viji Srinivasan (IBM T.J. Watson Research Center)
  • Sandhya Dwarkadas (Department of Computer Science, University of Rochester)
  • Alper Buyuktosunoglu (IBM T.J. Watson Research Center)

  • Venue:
  • PACS '03: Proceedings of the Third International Conference on Power-Aware Computer Systems
  • Year:
  • 2003

Abstract

As technology scales and processor speeds improve, power has become a first-order design constraint in all aspects of processor design. In this paper, we explore the use of criticality metrics to reduce dynamic and leakage energy within data caches. We leverage the ability to predict whether an access is on the application’s critical path to partition accesses into multiple streams. Accesses on the critical path are serviced by a high-performance (hot) cache bank; accesses not on the critical path are serviced by a lower-energy, lower-performance (cold) cache bank. The resulting organization is a physically banked cache with a different level of energy consumption and performance in each bank. Our results demonstrate that such a classification of instructions and data across two streams can be achieved with high accuracy: each additional cycle of cold-cache access time slows performance by only 0.8%. However, such a partition can increase contention for cache banks and entails non-negligible hardware overhead. While prior research has effectively employed criticality metrics to reduce power in arithmetic units, our analysis shows that the success of these techniques is limited when applied to data caches.
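The hot/cold steering idea can be illustrated with a minimal sketch: a table of saturating counters indexed by load PC predicts whether an access is critical, and the access is then routed to the corresponding bank. All names and parameters below (table size, threshold, bank latencies) are illustrative assumptions, not the design evaluated in the paper.

```python
# Sketch of criticality-based hot/cold bank steering (assumed parameters).

HOT_LATENCY = 1    # cycles: high-performance (hot) bank
COLD_LATENCY = 3   # cycles: lower-energy (cold) bank

class CriticalityPredictor:
    """Per-PC table of saturating counters: a high count marks a load as critical."""
    def __init__(self, entries=1024, threshold=2, max_count=3):
        self.table = [0] * entries
        self.entries = entries
        self.threshold = threshold
        self.max_count = max_count

    def predict_critical(self, pc):
        return self.table[pc % self.entries] >= self.threshold

    def update(self, pc, was_critical):
        # Train the counter toward the observed criticality of this load PC.
        idx = pc % self.entries
        if was_critical:
            self.table[idx] = min(self.table[idx] + 1, self.max_count)
        else:
            self.table[idx] = max(self.table[idx] - 1, 0)

def service_access(predictor, pc):
    """Steer the access to the hot or cold bank and return its access latency."""
    if predictor.predict_critical(pc):
        return HOT_LATENCY   # predicted critical: fast, higher-energy bank
    return COLD_LATENCY      # predicted non-critical: slower, lower-energy bank

# Example: after observing a load PC as critical twice, it is steered hot.
pred = CriticalityPredictor()
pred.update(pc=0x400A10, was_critical=True)
pred.update(pc=0x400A10, was_critical=True)
assert service_access(pred, 0x400A10) == HOT_LATENCY
```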