One of the most difficult tasks facing CAD users is evaluating and comparing different tools and algorithms. Purchasers of commercial software must understand how well a given tool performs the required job and which of many candidates best fits the problems they will face. Tool developers, in academia and industry alike, must measure and compare the efficiency of critical algorithms to understand both tool behavior and progress over time.