Scalable precision cache analysis for real-time software
ACM Transactions on Embedded Computing Systems (TECS) - Special Section LCTES'05
Data caches significantly reduce the average memory access time and are essential for an efficient design. Because cache behavior depends directly on input data, the worst-case timing behavior, which is crucial for a reliable real-time system, is difficult to predict. Simulation is too time-consuming, and current worst-case execution time (WCET) approaches focus on instruction caches only. Existing approaches to data cache analysis either restrict cache behavior to predictable data accesses or classify input-dependent memory accesses as non-cacheable. In this paper we propose a worst-case timing analysis for direct-mapped data caches that classifies memory accesses as predictable or unpredictable. For unpredictable memory accesses, we propose a novel analysis framework that tightly bounds both their impact on the existing cache contents and their own cache behavior. For predictable memory accesses, we use a local cache simulation and data-flow techniques. Furthermore, we describe an implementation of the analysis framework, and several experiments demonstrate its applicability. The approach targets real-time software verification but is also useful for design space exploration.
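To illustrate the idea of separating predictable from unpredictable accesses, the following is a minimal, hypothetical sketch (not the paper's actual framework): a per-set "must" state for a direct-mapped cache, where a predictable access (known address) is classified against the guaranteed cache contents, while an unpredictable access (unknown address) may map to any set and therefore conservatively drops all guarantees. The class and method names are illustrative only.

```python
class MustCacheState:
    """Abstract direct-mapped cache state for WCET-style classification.

    sets[i] holds the tag guaranteed to reside in set i, or None if
    no line is guaranteed to be present (empty or possibly evicted).
    """

    def __init__(self, num_sets, line_size):
        self.num_sets = num_sets
        self.line_size = line_size
        self.sets = [None] * num_sets

    def _index_and_tag(self, addr):
        line = addr // self.line_size
        return line % self.num_sets, line // self.num_sets

    def access(self, addr):
        """Predictable access with a known address.

        Returns 'always-hit' if the line is guaranteed resident,
        otherwise 'not-classified', and updates the must-state.
        """
        idx, tag = self._index_and_tag(addr)
        hit = self.sets[idx] == tag
        self.sets[idx] = tag  # after the access, the line is resident
        return "always-hit" if hit else "not-classified"

    def access_unpredictable(self):
        """Unpredictable access: the unknown address may evict any line,
        so no residency guarantee survives."""
        self.sets = [None] * self.num_sets
```

A tighter analysis, as the paper aims for, would bound the damage of an unpredictable access to fewer sets instead of clearing the whole state; the sketch above shows only the fully conservative baseline.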