Logic simulators remain the most popular verification tools because they provide full controllability and visibility during the verification process. However, their simulation speed is too slow for large sets of input patterns. Hardware emulation, such as FPGA-based prototyping, offers much higher speed, but the poor internal visibility of FPGAs makes debugging with this approach very difficult. The work described in this article focuses on building, for low-cost FPGAs, debugging capabilities that are currently available only in expensive emulators, such as the full visibility provided by software simulators. The authors propose an efficient approach that records an FPGA's internal behavior and replays the interesting time window in a software simulator. High simulation speed is preserved because most of the simulation effort is completed in the FPGA, while full visibility and a richer debugging environment are available in the software simulator when replaying the time frames that contain errors. To reduce hardware overhead, the authors also propose an algorithm that minimizes the amount of recorded data. Experimental results confirm the efficiency of the approach.
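One way to picture the data-minimization step is as a feedback vertex set (FVS) problem on the register-dependency graph: if every cycle contains at least one recorded flip-flop, the values of the remaining registers can be reconstructed acyclically during software replay. The sketch below is an illustration of this general idea using a simple greedy heuristic, not the authors' actual algorithm; the graph, function names, and selection rule are all assumptions for demonstration.

```python
# Illustrative sketch (NOT the paper's algorithm): pick a set of
# registers to record so that removing them from the dependency
# graph leaves it acyclic, letting the rest be reconstructed offline.

def find_cycle(graph):
    """Return one cycle as a list of nodes, or None if acyclic (DFS)."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {n: WHITE for n in graph}
    stack = []

    def dfs(u):
        color[u] = GRAY
        stack.append(u)
        for v in graph.get(u, ()):
            if color.get(v, BLACK) == GRAY:
                return stack[stack.index(v):]   # found a back edge
            if color.get(v, BLACK) == WHITE:
                found = dfs(v)
                if found:
                    return found
        stack.pop()
        color[u] = BLACK
        return None

    for n in list(graph):
        if color[n] == WHITE:
            found = dfs(n)
            if found:
                return found
    return None

def greedy_record_set(deps):
    """deps: {register: set of successor registers}. Greedily record
    one register per remaining cycle until the graph is acyclic."""
    graph = {u: set(vs) for u, vs in deps.items()}
    recorded = set()
    while (cycle := find_cycle(graph)) is not None:
        # Greedy choice: record the cycle node with the most fan-out.
        victim = max(cycle, key=lambda n: len(graph.get(n, ())))
        recorded.add(victim)
        graph.pop(victim, None)
        for vs in graph.values():
            vs.discard(victim)
    return recorded

# Toy dependency graph with two cycles: a->b->a and c->d->e->c.
deps = {"a": {"b"}, "b": {"a"}, "c": {"d"}, "d": {"e"}, "e": {"c"}}
to_record = greedy_record_set(deps)  # one register per cycle suffices
```

A greedy heuristic like this is only an approximation; exact FVS computation (as in the cited work on contraction operations) yields provably minimal record sets at higher cost.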