Performance evaluation for hybrid architectures
Simulation has long been the de facto standard for evaluating the performance of newly proposed ideas in computer architecture. While simulation offers theoretically arbitrary fidelity (at least down to cycle accuracy) and the ability to monitor an architecture without perturbing its execution, in practice it suffers from low effective fidelity and long execution times. We (and others) have advocated empirical experimentation on reconfigurable hardware for computer architecture performance assessment. In this paper, we describe an empirical performance assessment subsystem implemented in reconfigurable hardware and illustrate its use. The results presented demonstrate the need for the kinds of performance assessment that reconfigurable hardware can provide.
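The perturbation problem mentioned above can be illustrated with a toy software example (this sketch is not from the paper; the function names and the event-counting scheme are invented for illustration). Inline software instrumentation executes extra work inside the kernel being measured, so the measured code is no longer the code that runs in production; a hardware counter subsystem, by contrast, observes events without adding instructions to the measured path:

```python
import time

def workload(n):
    # Uninstrumented kernel: a simple compute loop standing in for
    # an application under study.
    total = 0
    for i in range(n):
        total += i * i
    return total

def workload_instrumented(n, events):
    # Same kernel with inline software instrumentation: every iteration
    # also updates an event dictionary. The result is unchanged, but the
    # instrumentation itself consumes cycles inside the measured region,
    # perturbing the very timing being observed.
    total = 0
    for i in range(n):
        total += i * i
        events["iters"] = events.get("iters", 0) + 1
    return total

if __name__ == "__main__":
    n = 200_000
    events = {}

    t0 = time.perf_counter()
    workload(n)
    plain = time.perf_counter() - t0

    t0 = time.perf_counter()
    workload_instrumented(n, events)
    instrumented = time.perf_counter() - t0

    # The instrumented run typically takes measurably longer, even though
    # both compute the same result and count the same events.
    print(f"plain: {plain:.4f}s  instrumented: {instrumented:.4f}s  "
          f"events: {events['iters']}")
```

A reconfigurable-hardware assessment subsystem of the kind the paper describes avoids this distortion by counting events in parallel logic, leaving the software execution path untouched.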