Automated inference of goal-oriented performance prediction functions
Proceedings of the 27th IEEE/ACM International Conference on Automated Software Engineering
The performance of today's enterprise applications is influenced by a variety of parameters across different layers, which makes evaluating the performance of such systems a time- and resource-consuming process. The number of possible parameter combinations and configurations requires many experiments to derive meaningful conclusions. Although many tools for automated performance testing are available, controlling experiments and analyzing results still require substantial manual effort. In this paper, we apply statistical model inference techniques, namely Kriging and MARS (multivariate adaptive regression splines), to select experiments adaptively. Our approach automatically selects and conducts experiments based on the accuracy observed for the models inferred from the currently available data. We validated the approach in an industrial ERP scenario. The results demonstrate that we can automatically infer a prediction model with a mean relative error of 1.6% using only 18% of the measurement points in the configuration space.
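The adaptive loop sketched in the abstract can be illustrated as follows. This is a hypothetical, heavily simplified sketch, not the paper's implementation: a one-dimensional configuration space, a piecewise-linear interpolant standing in for the Kriging/MARS surrogates, and a synthetic `run_experiment` function standing in for measuring the real system. The selection rule (measure the candidate farthest from all measured points, stop once its value is predicted within a relative-error target) is one simple instance of accuracy-driven experiment selection.

```python
# Hedged sketch of accuracy-driven adaptive experiment selection.
# run_experiment, the response curve, and all thresholds are illustrative
# assumptions, not taken from the paper.
import numpy as np

def run_experiment(x):
    # Stand-in for one measurement of the system under test
    # (hypothetical response-time curve over a single parameter).
    return 50.0 / x + 2.0 * x

space = np.linspace(1.0, 20.0, 200)   # candidate configurations
xs = [space[0], space[-1]]            # seed measurements at the boundaries
ys = [run_experiment(x) for x in xs]
target = 0.02                         # relative-error stopping threshold

while True:
    # Surrogate model inferred from the currently available data
    # (piecewise-linear here; the paper uses Kriging and MARS).
    order = np.argsort(xs)
    fx, fy = np.array(xs)[order], np.array(ys)[order]

    remaining = [x for x in space if x not in xs]
    if not remaining:
        break

    # Select the unmeasured candidate farthest from every measured point,
    # predict it, then measure it and compare prediction to observation.
    nxt = max(remaining, key=lambda x: min(abs(x - m) for m in xs))
    pred = float(np.interp(nxt, fx, fy))
    actual = run_experiment(nxt)
    xs.append(nxt)
    ys.append(actual)
    if abs(pred - actual) / actual < target:
        break  # surrogate is accurate enough at the most uncertain point

used = len(xs) / len(space)
print(f"measured {len(xs)} of {len(space)} points ({used:.0%})")
```

In this toy setup the loop stops after measuring only a small fraction of the candidate grid, mirroring the paper's result of reaching a usable prediction model from a subset of the configuration space; a production version would use a proper model-accuracy estimate (e.g. cross-validation) rather than a single-point check.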