Courteous cache sharing: being nice to others in capacity management

  • Authors:
  • Akbar Sharifi, Shekhar Srikantaiah, Mahmut Kandemir, Mary Jane Irwin

  • Affiliations:
  • The Pennsylvania State University, University Park, PA (all authors)

  • Venue:
  • Proceedings of the 49th Annual Design Automation Conference
  • Year:
  • 2012


Abstract

This paper proposes a cache management scheme for multiprogrammed workloads of multithreaded applications, with the objective of maximizing performance for both the individual applications and the workload mix as a whole. In this scheme, each application's performance is improved by raising the cache priority of its slowest thread, while overall system performance is preserved by ensuring that one application's benefit does not come at the cost of significant degradation to other applications' threads sharing the same cache. Averaged over six workloads, the shared cache management scheme improves the combined performance of the applications by 18%. These improvements are also distributed fairly across the applications in each mix, as indicated by an average fair speedup improvement of 10% across the threads of each application (averaged over all workloads).
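
The mechanism described in the abstract (boosting the slowest thread of each application while bounding the harm done to co-running applications) can be illustrated with a minimal sketch. This is not the authors' implementation; the names `Thread`, `rebalance`, and the `MAX_DONOR_LOSS` cap are hypothetical, and the sketch assumes a way-partitioned last-level cache whose per-thread way allocations can be adjusted each epoch.

```python
# Hypothetical sketch of one "courteous" reallocation epoch for a
# way-partitioned shared cache: each application's slowest thread gets
# extra ways, but donors from other applications lose only a bounded
# fraction of their current allocation.

from dataclasses import dataclass

MAX_DONOR_LOSS = 0.10    # assumed cap on per-thread degradation per epoch

@dataclass
class Thread:
    app_id: int
    ways: int        # cache ways currently assigned to this thread
    progress: float  # e.g., committed instructions this epoch (lower = slower)

def rebalance(threads: list[Thread]) -> None:
    """One courteous reallocation pass over all co-running threads."""
    apps = {t.app_id for t in threads}
    for app in apps:
        own = [t for t in threads if t.app_id == app]
        others = [t for t in threads if t.app_id != app]
        # The slowest thread bounds its application's performance,
        # so it is the one whose cache allocation is increased.
        slowest = min(own, key=lambda t: t.progress)
        for donor in sorted(others, key=lambda t: t.progress, reverse=True):
            # Be courteous: take at most MAX_DONOR_LOSS of a donor's
            # ways, and never leave a donor with zero ways.
            give = min(int(donor.ways * MAX_DONOR_LOSS), donor.ways - 1)
            if give > 0:
                donor.ways -= give
                slowest.ways += give
                break  # one donor per application per epoch keeps changes gradual
```

In this sketch, fairness follows from the per-epoch cap: no thread of another application can lose more than a small, fixed fraction of its allocation in any one rebalance step, so gains for one application's slowest thread cannot translate into large losses elsewhere.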