Impact of Gate-Length Biasing on Threshold-Voltage Selection

  • Authors: Andrew B. Kahng; Swamy Muddu; Puneet Sharma
  • Affiliations: ECE and CSE, UC San Diego; ECE, UC San Diego; ECE, UC San Diego
  • Venue: ISQED '06, Proceedings of the 7th International Symposium on Quality Electronic Design
  • Year: 2006

Abstract

Gate-length biasing is a runtime leakage reduction technique that leverages the short-channel effect: marginally increasing the gate length of MOS devices significantly reduces their leakage current at a small delay penalty. The technique was previously shown to work effectively alongside multi-threshold-voltage assignment, the only mainstream approach to runtime leakage reduction. Typically, designers optimize their designs using threshold voltages selected by the foundry: higher-threshold-voltage devices, which are less leaky but slower, are assigned to non-critical paths, while lower-threshold-voltage devices, which are faster but leakier, are assigned to critical paths. In this paper, we study the effect of modifying the foundry-set threshold voltages on the leakage reduction achieved for three large, real-world designs, and we assess the impact of the availability of gate-length biasing on threshold-voltage selection. We achieve leakage reductions with foundry-set dual threshold voltages and biasing comparable to those achieved with foundry-set triple threshold voltages without biasing. Our results indicate that leakage reductions can be improved if threshold voltages are chosen with the availability of gate-length biasing in mind; we also observe that foundry-set threshold voltages are not optimal for achieving the best possible leakage reductions.
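The slack-driven assignment the abstract describes — slow, low-leakage variants on non-critical paths, fast, leaky variants on critical paths — can be illustrated with a minimal greedy sketch. This is not the authors' algorithm; the library variants, leakage/delay numbers, and cell names below are all hypothetical, and real flows use incremental static timing analysis rather than a fixed per-cell slack.

```python
# Illustrative sketch (hypothetical numbers, not the paper's method):
# for each cell, pick the least-leaky variant whose delay penalty
# fits within that cell's timing slack.
from dataclasses import dataclass

@dataclass
class Variant:
    name: str
    leakage: float  # normalized leakage current
    delay: float    # normalized cell delay (nominal low-Vt = 1.0)

# Hypothetical cell library: high-Vt, low-Vt with a small gate-length
# bias, and nominal low-Vt, ordered from least to most leaky.
VARIANTS = [
    Variant("hvt", leakage=0.1, delay=1.3),
    Variant("lvt_biased", leakage=0.6, delay=1.05),
    Variant("lvt", leakage=1.0, delay=1.0),
]

@dataclass
class Cell:
    name: str
    slack: float  # timing slack, in units of nominal delay

def assign(cells):
    """Greedily map each cell to the least-leaky variant its slack can absorb."""
    choice = {}
    for cell in cells:
        for v in VARIANTS:  # least leaky first
            if v.delay - 1.0 <= cell.slack:
                choice[cell.name] = v
                break
    return choice

cells = [Cell("u1", slack=0.5), Cell("u2", slack=0.08), Cell("u3", slack=0.0)]
result = assign(cells)
# u1 has ample slack -> hvt; u2 has a little slack -> lvt_biased;
# u3 is on the critical path -> lvt
```

The sketch shows why biasing helps with only two threshold voltages: the biased low-Vt variant fills the gap between fast/leaky and slow/low-leakage cells, recovering leakage on cells (like `u2`) whose slack is too small for a high-Vt swap.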