Scaling probabilistic timing verification of hardware using abstractions in design source code
Proceedings of the International Conference on Formal Methods in Computer-Aided Design
Adaptive techniques such as voltage and frequency scaling, process variations, and the randomness of input data contribute significantly to the statistical character of contemporary hardware designs. Consequently, the performance metrics of such designs are also statistical in nature. In previous work, we employed probabilistic model checking to rigorously evaluate the statistical performance of hardware designs. In this paper, we present an automatic compositional reasoning technique that improves the scalability of probabilistic model checking of hardware systems. We partition the set of system observables into disjoint subsets and use them to structurally decompose the system into smaller components. We then apply an assume-guarantee style of reasoning and analyze the space of environmental constraints via value-based case splitting: the space of values of all the observables of one component is split into separate value-based cases, each analyzed independently. We provide an argument for the correctness of our technique, and we demonstrate its effectiveness by making probabilistic model checking feasible for evaluating performance metrics, such as delay and Bit Error Rate (BER), of non-trivial hardware designs that we use as case studies. For example, we are able to determine the statistical delay of a 64-bit adder design with over $10^{40}$ states. We use PRISM as the probabilistic model checking engine in all our experiments.
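To make the idea concrete, the following is a minimal Python sketch (not the paper's PRISM models, and with illustrative names of our own choosing) of value-based case splitting on a ripple-carry adder. The adder is decomposed into a low and a high component whose only shared observable is the intermediate carry; the carry-out probability of the whole design under uniform random inputs is recovered by analyzing each component separately, case-splitting on the two possible values of that interface carry, instead of enumerating the monolithic state space.

```python
from itertools import product

def carry_out(a_bits, b_bits, cin):
    """Carry-out of a ripple-carry addition of two bit tuples (LSB first)."""
    c = cin
    for a, b in zip(a_bits, b_bits):
        c = (a & b) | (c & (a ^ b))  # full-adder carry recurrence
    return c

def monolithic_cout_prob(n):
    """Baseline: enumerate all 2^(2n) uniform inputs of an n-bit adder."""
    ones = sum(carry_out(bits[:n], bits[n:], 0)
               for bits in product((0, 1), repeat=2 * n))
    return ones / 2 ** (2 * n)

def compositional_cout_prob(n, k):
    """Decompose into low k bits and high n-k bits, then case-split on the
    interface carry value c in {0, 1} (the shared observable)."""
    # Component 1 (low half): probability of producing interface carry c.
    low_counts = {0: 0, 1: 0}
    for bits in product((0, 1), repeat=2 * k):
        low_counts[carry_out(bits[:k], bits[k:], 0)] += 1
    p_low = {c: low_counts[c] / 2 ** (2 * k) for c in (0, 1)}
    # Component 2 (high half): under each assumption cin = c, the
    # guaranteed carry-out probability; combine weighted by p_low.
    m = n - k
    p_cout = 0.0
    for c in (0, 1):
        ones = sum(carry_out(bits[:m], bits[m:], c)
                   for bits in product((0, 1), repeat=2 * m))
        p_cout += p_low[c] * (ones / 2 ** (2 * m))
    return p_cout

print(monolithic_cout_prob(4))         # 0.46875
print(compositional_cout_prob(4, 2))   # 0.46875 — the two analyses agree
```

The compositional analysis enumerates two halves of the design under two carry assumptions each, rather than the full cross product of inputs; this is the same cost reduction, in miniature, that makes the 64-bit adder with over $10^{40}$ states tractable in the paper.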