Performance Engineering of Software Systems
Proceedings of the 4th ACM/SPEC International Conference on Performance Engineering
In software performance engineering, the infrastructure on which an application runs plays a crucial role in predicting the application's performance. To yield accurate prediction results, performance-relevant properties and behaviour of the infrastructure therefore have to be integrated into performance models. Capturing these properties, however, is a cumbersome and error-prone task, as it requires carefully engineered measurements and experiments. Existing approaches for creating infrastructure performance models either require manual coding of these experiments or omit the detailed properties from the models. The contribution of this paper is the Ginpex approach, which introduces goal-oriented, model-based specification and generation of executable performance experiments for detecting and quantifying performance-relevant infrastructure properties. Ginpex provides a metamodel for experiment specification and comes with predefined experiment templates that automate both experiment execution on the target platform and the evaluation of the experiment results. We evaluate Ginpex in two case studies, executing experiments to detect the operating system scheduler's timeslice length and to quantify the CPU virtualization overhead for an application running in a virtualized environment.
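To illustrate the kind of measurement-based experiment described above, here is a minimal sketch (not Ginpex's actual implementation) of how a scheduler timeslice could be inferred: a measurement thread timestamps in a tight loop while a competing CPU-bound task runs, and gaps between consecutive timestamps that clearly exceed ordinary loop jitter are taken as the slices granted to the competitor. The function name, the trace, and the noise threshold are illustrative assumptions.

```python
# Hypothetical sketch of a timeslice-detection experiment of the kind
# Ginpex automates. Gaps below the noise threshold are treated as loop
# jitter; larger gaps are assumed to be preemptions by a competing task.
import statistics

def estimate_timeslice_ms(gaps_ms, noise_threshold_ms=1.0):
    """Return the median of observed gaps that exceed the noise threshold,
    interpreted as an estimate of the scheduler's timeslice length."""
    preemption_gaps = [g for g in gaps_ms if g > noise_threshold_ms]
    if not preemption_gaps:
        return None  # no preemption observed in the trace
    return statistics.median(preemption_gaps)

# Synthetic gap trace (milliseconds): mostly sub-millisecond jitter,
# with preemption gaps clustered around a 100 ms timeslice.
trace = [0.01, 0.02, 99.8, 0.01, 100.2, 0.02, 100.0, 0.01]
print(estimate_timeslice_ms(trace))  # -> 100.0
```

In a real experiment the gap trace would come from repeated high-resolution timestamps (e.g. a monotonic clock) rather than synthetic data, and the threshold would be calibrated against the measured jitter of an uncontended run.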