Profiling consists of three stages: the collection of performance data, the processing of that data to infer performance information, and the feedback of this information into the system. Feedback means using the information either to change the runtime system or to adjust the sampling parameters for subsequent profiling runs; this paper concentrates on the latter. The majority of existing profiling tools focus on data collection, requiring manual intervention during processing and feedback: the developer must interpret the results presented to them in order to identify new profiling strategies. We introduce the concept of profiling regulation, whereby the processes of collection, processing and feedback are automated. We define a domain-specific language, sampspec, that provides expressibility and control over the profiling process. The developer writes a declarative specification of the information to collect, the computations to perform, and the strategies to employ based on this information, in contrast to manually inspecting results and restarting the profiler. Profiling thus becomes a matter of specifying strategies for data collection and processing, and how those strategies adapt over time. In this paper, we describe the system model and illustrate our language through a series of worked examples.
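The regulation loop described above — collection, processing, and feedback adjusting the sampling parameters of the next run — can be sketched as follows. This is a minimal illustrative sketch only: the function names, the hot-sample metric, and the interval-halving policy are all assumptions, and the paper's sampspec language is declarative rather than imperative Python.

```python
import random

def collect(sample_interval, n_events=1000):
    """Collection: keep every `sample_interval`-th event (simulated here)."""
    random.seed(0)  # deterministic synthetic event stream for illustration
    events = [random.random() for _ in range(n_events)]
    return events[::sample_interval]

def process(samples, threshold=0.9):
    """Processing: infer a performance metric — the fraction of 'hot' samples."""
    hot = sum(1 for s in samples if s > threshold)
    return hot / len(samples)

def feedback(hot_fraction, sample_interval):
    """Feedback: adapt the sampling parameters for the next run,
    sampling more densely when hot activity is observed."""
    if hot_fraction > 0.05:
        return max(1, sample_interval // 2)  # refine: halve the interval
    return sample_interval * 2               # coarsen: back off

# Automated regulation: no manual inspection or profiler restart.
interval = 64
for run in range(3):
    metric = process(collect(interval))
    interval = feedback(metric, interval)
```

The point of the sketch is the closed loop: each run's processed result drives the next run's sampling parameters automatically, which is what distinguishes profiling regulation from collect-then-inspect tools.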