Extending data prefetching to cope with context switch misses

  • Authors:
  • Hanyu Cui; Suleyman Sair

  • Affiliations:
  • Qualcomm; Intel

  • Venue:
  • ICCD '09: Proceedings of the 2009 IEEE International Conference on Computer Design
  • Year:
  • 2009

Abstract

Among the various costs of a context switch, its impact on the performance of L2 caches is the most significant because of the resulting high miss penalty. To reduce the impact of frequent context switches, we propose restoring a program's locality by prefetching into the L2 cache the data the program was using before it was swapped out. A Global History List is used to record a process's L2 read accesses in LRU order. These accesses are saved along with the process's context when the process is swapped out and are loaded to guide prefetching when it is swapped back in. We also propose a feedback mechanism that greatly reduces the memory traffic incurred by our prefetching scheme. Experiments show significant speedup over baseline architectures, with and without traditional prefetching, in the presence of frequent context switches.
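The mechanism described in the abstract can be sketched in a few lines. This is a hypothetical simulation-level illustration, not the authors' implementation: the GHL capacity, the `useful_ratio` feedback signal, and the throttling `threshold` are all assumed parameters introduced here for clarity.

```python
from collections import OrderedDict

class GlobalHistoryList:
    """Sketch of a Global History List (GHL): records a process's L2
    read accesses in LRU order so that the most recently used blocks
    can be prefetched back after a context switch. Capacity is an
    assumed parameter, not taken from the paper."""

    def __init__(self, capacity=64):
        self.capacity = capacity
        self.blocks = OrderedDict()  # block address -> present; MRU at the end

    def record_l2_read(self, block_addr):
        # Move the block to the MRU position; evict the LRU entry if full.
        self.blocks.pop(block_addr, None)
        self.blocks[block_addr] = True
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)

    def snapshot(self):
        # Saved along with the process context on swap-out (MRU first).
        return list(reversed(self.blocks))

def prefetch_on_swap_in(snapshot, issue_prefetch, useful_ratio, threshold=0.25):
    """Feedback-throttled restore on swap-in: skip prefetching entirely
    when past prefetches were rarely useful (a simple stand-in for the
    paper's traffic-reducing feedback mechanism). Returns the number of
    prefetches issued."""
    if useful_ratio < threshold:
        return 0
    for addr in snapshot:
        issue_prefetch(addr)  # issue an L2 prefetch for this block
    return len(snapshot)
```

For example, after the reads `0x100, 0x200, 0x100`, the snapshot orders blocks MRU-first (`0x100` before `0x200`), so the working set is restored in recency order when the process is scheduled again.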