UML components: a simple process for specifying component-based software
Summarizing application performance from a components perspective
Proceedings of the 10th European software engineering conference held jointly with 13th ACM SIGSOFT international symposium on Foundations of software engineering
Selective profiling of Java applications using dynamic bytecode instrumentation
ISPASS '04 Proceedings of the 2004 IEEE International Symposium on Performance Analysis of Systems and Software
CoCoME - The Common Component Modeling Example
The Common Component Modeling Example
The Palladio component model for model-driven performance prediction
Journal of Systems and Software
Improved Feedback for Architectural Performance Prediction Using Software Cartography Visualizations
QoSA '09 Proceedings of the 5th International Conference on the Quality of Software Architectures: Architectures for Adaptive Software Systems
Assigning Blame: Mapping Performance to High Level Parallel Programming Abstractions
Euro-Par '09 Proceedings of the 15th International Euro-Par Conference on Parallel Processing
Usage profile and platform independent automated validation of service behavior specifications
Proceedings of the 2nd International Workshop on the Quality of Service-Oriented Software Systems
Palladio-based performance blame analysis
Proceedings of the 16th international workshop on Component-oriented programming
An accuracy information annotation model for validated service behavior specifications
MODELS'10 Proceedings of the 2010 international conference on Models in software engineering
When developing component-based systems, we incorporate third-party black-box components whose performance contracts have been specified by their developers. If errors occur when testing the system built from these components, it is important to determine whether individual components violate their performance contracts or whether the composition itself is faulty. This task is called performance blame analysis. In our previous work, we presented a performance blame analysis approach that blames components by comparing response-time values from the failed test case with expected values derived from the performance contract. In that approach, the system architect must manually assess whether the test data series shows faster or slower response times than the data derived from the contract; this is laborious because it must be done for each component operation. In this paper, we present an automated comparison of each pair of data series as decision support. In contrast to our work, other approaches do not achieve fully automated decision support because they do not incorporate sufficiently sophisticated contracts. We exemplify our performance blame analysis, including the automated decision support, using the "Common Component Modeling Example" (CoCoME) benchmark.
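The automated comparison described above can be illustrated with a minimal sketch. The function name, tolerance parameter, and decision rule below are assumptions for illustration only; the paper's actual decision procedure for classifying a data series is not reproduced here. The sketch compares the median measured response time of a component operation against the median expected value derived from its performance contract:

```python
from statistics import median

def compare_series(measured, expected, rel_tolerance=0.1):
    """Classify one component operation's measured response times
    against values derived from its performance contract.

    Returns "slower", "faster", or "conforms".  This is a simple
    median comparison with a relative tolerance band -- a hypothetical
    stand-in for the paper's decision-support procedure.
    """
    m, e = median(measured), median(expected)
    if m > e * (1 + rel_tolerance):
        return "slower"    # candidate component to blame
    if m < e * (1 - rel_tolerance):
        return "faster"
    return "conforms"

# One automated verdict per component operation, replacing the
# manual assessment by the system architect (example values):
measured = [12.1, 13.4, 12.8, 14.0]  # response times from failed test (ms)
expected = [10.0, 10.5, 9.8, 10.2]   # derived from the performance contract
print(compare_series(measured, expected))  # → slower
```

A production variant would likely replace the median comparison with a statistical test over the two distributions, since single summary statistics can hide variance in response-time data.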