Pipeline damping: a microarchitectural technique to reduce inductive noise in supply voltage

  • Authors:
  • Michael D. Powell; T. N. Vijaykumar

  • Affiliations:
  • Purdue University; Purdue University

  • Venue:
  • Proceedings of the 30th Annual International Symposium on Computer Architecture
  • Year:
  • 2003

Abstract

Scaling of CMOS technology causes power supply voltages to fall and supply currents to rise even as operating speeds increase. Falling supply voltages shrink noise margins, while rising current and frequency make supply noise injection larger, especially noise caused by inductance in the supply lines. Designing the power distribution system is one of the key challenges in modern chip design. Decoupling capacitance helps reduce inductance effects, but there is often a peak in the supply impedance at a resonant frequency set roughly by the package inductance and the on-chip decoupling capacitors. This frequency is on the order of 100 MHz, much lower than the operating frequency of the processor. We propose pipeline damping, an architectural technique that controls instruction issue to guarantee bounds on current variation around the supply-resonance frequency, thus reducing the resulting supply noise. Damping is a cheaper alternative to expensive, circuit-based noise-reduction techniques. We make the fundamental observation that limiting the change in current flow (di) within the resonant time period (dt) controls di/dt without large performance loss. Damping guarantees bounds on current variation while still allowing processor current to rise or fall to the magnitude required to maintain performance. Our results show that a damped processor guarantees a 33% reduction in worst-case current variation, with an average performance degradation of 7% and an average energy-delay of 1.09 relative to an undamped processor.
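
To make the di/dt-bounding idea concrete, here is a minimal sketch of a damping-style issue controller: it clamps the change in per-window issue activity (a rough proxy for current draw) between consecutive windows whose length matches the resonant period. The window granularity, the MAX_DELTA bound, the issue-count-as-current model, and all names below are illustrative assumptions for exposition, not the paper's actual mechanism.

```python
# Sketch of pipeline damping's core idea: bound the change in activity
# (a proxy for current) between consecutive resonant-period windows, so
# di per dt at the resonance frequency stays within a guaranteed bound.
# MAX_DELTA and the activity model are assumed values for illustration.

MAX_DELTA = 2  # assumed bound on issue-count change per resonant window


def damped_issue_schedule(ready_counts, max_delta=MAX_DELTA):
    """For each resonant window, issue as many ready instructions as
    possible while keeping the window-to-window change within max_delta."""
    issued_per_window = []
    prev = 0
    for ready in ready_counts:
        lo = max(0, prev - max_delta)    # current may fall only so fast...
        hi = prev + max_delta            # ...and rise only so fast
        issued = min(max(ready, lo), hi) # clamp demand into [lo, hi]
        # If ready < lo, filler work (e.g., no-op issue) would be needed to
        # keep current from dropping too quickly; modeled here as issuing lo.
        issued_per_window.append(issued)
        prev = issued
    return issued_per_window


if __name__ == "__main__":
    # Bursty demand: idle, then a sudden burst, then idle again.
    demand = [0, 0, 8, 8, 8, 0, 0]
    print(damped_issue_schedule(demand))
    # -> [0, 0, 2, 4, 6, 4, 2]: both the rise and the fall are ramped,
    #    so the per-window current change never exceeds MAX_DELTA.
```

Note how this matches the abstract's observation: the controller does not cap the magnitude of current, only its rate of change per resonant window, so sustained high demand eventually reaches full issue rate while abrupt swings near the resonance frequency are smoothed out.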