Timed input pattern generation for an accurate delay calculation under multiple input switching

  • Authors:
  • Seung Hoon Choi; Kunhyuk Kang; Florentin Dartu; Kaushik Roy

  • Affiliations:
  • Intel Corporation, Hillsboro, OR; Design Technology and Solutions, Intel Corporation, Hillsboro, OR; Synopsys Inc., Mountain View, CA; Electrical and Computer Engineering, Purdue University, West Lafayette, IN

  • Venue:
  • IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
  • Year:
  • 2010

Abstract

In multiple input switching (MIS) analysis, input signal alignment is a key factor that determines the quality and accuracy of the approach. In this paper, we propose a new signal alignment methodology for MIS analysis based on a transistor-level simulator at the core of the static timing analysis. The proposed methodology searches through the possible input vectors in an efficient order to reduce the number of simulations and finds the true worst-case signal alignment for both the MIN and the MAX analyses. In our 180 nm simulation setup, the worst-case delay is predicted within 0.5% error for more than 97% of the test cases, with an average of fewer than two simulations per logic gate.
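To make the idea concrete, below is a minimal sketch of the kind of alignment search the abstract describes. It is not the authors' algorithm: the `simulate_delay` callback stands in for the transistor-level simulator, the `skew_candidates` grid and the greedy coordinate search are illustrative assumptions, and real implementations would use sensitivity information to approach the paper's ~2 simulations per gate.

```python
# Hypothetical sketch: searching input arrival-time alignments for the
# worst-case (MAX) or best-case (MIN) gate delay under MIS. All names
# and numbers are illustrative assumptions, not the paper's method.

from typing import Callable, List, Sequence, Tuple


def search_worst_alignment(
    num_inputs: int,
    skew_candidates: Sequence[float],                    # arrival-time offsets (ps)
    simulate_delay: Callable[[Tuple[float, ...]], float],
    maximize: bool = True,                               # True = MAX analysis, False = MIN
) -> Tuple[Tuple[float, ...], float, int]:
    """Greedy coordinate search over input arrival-time skews.

    Starts from the all-simultaneous alignment (often near the MIS
    worst case) and perturbs one input's skew at a time, keeping any
    change that worsens (MAX) or improves (MIN) the gate delay.
    Returns (alignment, delay, simulations_used).
    """
    align: List[float] = [0.0] * num_inputs
    best = simulate_delay(tuple(align))
    sims = 1
    improved = True
    while improved:
        improved = False
        for i in range(num_inputs):
            # Visit skews closest to simultaneous switching first.
            for skew in sorted(skew_candidates, key=abs):
                if skew == align[i]:
                    continue  # skip the current alignment
                trial = align.copy()
                trial[i] = skew
                delay = simulate_delay(tuple(trial))
                sims += 1
                if (delay > best) if maximize else (delay < best):
                    best, align, improved = delay, trial, True
    return tuple(align), best, sims


if __name__ == "__main__":
    # Toy surrogate delay model for a 2-input gate: delay peaks when
    # both inputs switch together (purely illustrative numbers, in ps).
    def toy_delay(alignment: Tuple[float, ...]) -> float:
        a, b = alignment
        return 50.0 - 0.3 * abs(a - b)

    align, delay, sims = search_worst_alignment(
        num_inputs=2,
        skew_candidates=(-20.0, -10.0, 0.0, 10.0, 20.0),
        simulate_delay=toy_delay,
    )
    print(f"worst alignment = {align}, delay = {delay:.1f} ps, sims = {sims}")
```

The search terminates because each accepted move strictly improves the objective over a finite alignment grid; ordering candidates by distance from zero skew reflects the common observation that near-simultaneous switching tends to dominate MIS delay.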