Performance (i.e., response time, throughput, resource consumption) is a key quality metric of today's applications, as it heavily affects customer satisfaction. SAP strives to identify and fix performance problems before customers encounter them; therefore, performance engineering methods are applied at all stages of the software lifecycle. However, especially in the development phase, continuous performance evaluations can introduce considerable overhead for developers, which hinders their broad adoption in practice. To evaluate the performance of a particular software artefact (e.g., to compare two design alternatives), a developer has to run measurements tailored to the artefact under test. Using standard benchmarks would create less overhead, but the information they yield is often insufficient to answer developers' specific questions. In this industrial paper, we present an approach that enables exhaustive, tailored performance testing with minimal effort for developers. The approach lets developers define benchmark applications through a domain-specific model and transforms those models into benchmark applications via a generic Benchmark Framework. Applying the approach in the context of the SAP Netweaver Cloud development environment demonstrated that we can efficiently identify performance problems that would not have been detected by our existing performance test infrastructure.
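As a rough illustration of the idea of separating a declarative benchmark model from a generic framework that executes it, consider the following minimal Java sketch. It is not the paper's actual API; BenchmarkModel, artefactUnderTest, and all other names here are hypothetical and stand in for the domain-specific model and the Benchmark Framework described in the abstract.

// Hypothetical sketch: a developer declares *what* to measure in a
// small domain-specific model; a generic framework part handles *how*
// it is measured (warmup, timing, aggregation). All names are
// illustrative, not the framework's real API.
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

public class BenchmarkSketch {

    /** A minimal "domain-specific model": the artefact under test plus run parameters. */
    record BenchmarkModel(String name, int warmupRuns, int measuredRuns,
                          Supplier<Long> artefactUnderTest) {}

    /** Generic framework part: turns any model into response-time measurements. */
    static List<Long> run(BenchmarkModel model) {
        for (int i = 0; i < model.warmupRuns(); i++) {
            model.artefactUnderTest().get();          // warm up JIT and caches
        }
        List<Long> responseTimesNanos = new ArrayList<>();
        for (int i = 0; i < model.measuredRuns(); i++) {
            long start = System.nanoTime();
            model.artefactUnderTest().get();
            responseTimesNanos.add(System.nanoTime() - start);
        }
        return responseTimesNanos;
    }

    public static void main(String[] args) {
        // The developer only declares the benchmark, e.g. to compare
        // two design alternatives; the framework does the rest.
        BenchmarkModel model = new BenchmarkModel(
                "string-concat-alternative", 100, 1_000,
                () -> (long) ("a" + System.nanoTime()).length());
        List<Long> times = run(model);
        double avgMs = times.stream().mapToLong(Long::longValue)
                            .average().orElse(0) / 1_000_000.0;
        System.out.printf("%s: avg response time %.3f ms%n",
                          model.name(), avgMs);
    }
}

In the approach described above, the model would additionally be expressed in a dedicated domain-specific language and transformed into a full benchmark application rather than run in-process, but the division of labour is the same: the developer supplies a tailored model, the generic framework supplies the measurement machinery.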