Capturing dynamic memory reference behavior with adaptive cache topology

  • Authors:
  • Jih-Kwon Peir; Yongjoon Lee; Windsor W. Hsu

  • Affiliations:
  • Computer & Information Science & Engineering Department, University of Florida, Gainesville, FL; Computer & Information Science & Engineering Department, University of Florida, Gainesville, FL; Computer Science Division, University of California, Berkeley, CA

  • Venue:
  • Proceedings of the Eighth International Conference on Architectural Support for Programming Languages and Operating Systems
  • Year:
  • 1998

Abstract

Memory references exhibit locality and are therefore not uniformly distributed across the sets of a cache. This skew reduces the effectiveness of a cache because it results in the caching of a considerable number of less-recently-used lines that are unlikely to be re-referenced before they are replaced. In this paper, we describe a technique that dynamically identifies these less-recently-used lines and effectively utilizes the cache frames they occupy to more accurately approximate a global least-recently-used replacement policy while maintaining the fast access time of a direct-mapped cache. We also explore the idea of using these underutilized cache frames to reduce cache misses through data prefetching. In the proposed design, the possible locations in which a line can reside are not predetermined. Instead, the cache is dynamically partitioned into groups of cache lines. Because both the total number of groups and the individual group associativity adapt to the dynamic reference pattern, we call this design the adaptive group-associative cache. Performance evaluation using trace-driven simulations of the TPC-C benchmark and selected programs from the SPEC95 benchmark suite shows that the group-associative cache achieves a hit ratio that is consistently better than that of a 4-way set-associative cache. For some of the workloads, the hit ratio approaches that of a fully-associative cache.
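
To make the idea concrete, below is a minimal behavioral sketch, not the authors' hardware design: a direct-mapped array augmented with a small LRU-ordered structure that retains lines displaced from their home frames, so that still-recently-used lines are not lost to conflict misses. The class name, parameter values, and the simple LRU-set bookkeeping are illustrative assumptions for simulation only.

```python
from collections import OrderedDict

class GroupAssociativeCacheSketch:
    """Behavioral sketch of the abstract's idea: a direct-mapped array whose
    displaced, still-recently-used lines are tracked in a small LRU-ordered
    out-of-position structure, approximating global LRU replacement while
    keeping the direct-mapped fast path. Structure names and sizes are
    illustrative, not the paper's hardware organization."""

    def __init__(self, num_frames=1024, line_size=32, oop_entries=64):
        self.num_frames = num_frames
        self.offset_bits = line_size.bit_length() - 1   # log2(line_size)
        self.home = [None] * num_frames   # frame index -> tag resident in its home slot
        self.oop = OrderedDict()          # LRU-ordered set of tags held out of position
        self.oop_entries = oop_entries
        self.hits = 0
        self.misses = 0

    def _split(self, addr):
        line = addr >> self.offset_bits
        return line % self.num_frames, line   # (home frame index, line tag)

    def access(self, addr):
        idx, tag = self._split(addr)
        if self.home[idx] == tag:          # fast, direct-mapped hit
            self.hits += 1
            return True
        if tag in self.oop:                # slower out-of-position hit
            self.oop.move_to_end(tag)      # refresh its recency
            self.hits += 1
            return True
        self.misses += 1
        victim = self.home[idx]
        self.home[idx] = tag               # fill the missing line into its home frame
        if victim is not None:
            # Retain the displaced line instead of discarding it: it may still
            # be more recently used than lines resident elsewhere in the cache.
            self.oop[victim] = None
            self.oop.move_to_end(victim)
            if len(self.oop) > self.oop_entries:
                self.oop.popitem(last=False)   # evict the least recently used entry
        return False

# Toy usage: two lines that map to the same home frame.
cache = GroupAssociativeCacheSketch()
for addr in (0x0000, 0x8000, 0x0000, 0x8000):
    cache.access(addr)
print(cache.hits, cache.misses)   # 2 hits, 2 misses on this trace
```

On this toy trace, a plain direct-mapped cache would see the two conflicting lines evict each other on every access (0 hits), whereas retaining the displaced line yields two hits, which is the effect the paper pursues at much larger scale with hardware-managed groups.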