This paper describes an approach for generating customized benchmark suites from a software architecture description, following the Model Driven Architecture (MDA) approach. The benchmark generation and performance data capture tool, MDABench, is built on widely used open source MDA frameworks. The benchmark application is modeled in UML and generated using existing community-maintained code generation "cartridges", so that current component technology can be exploited. We have also tailored the UML 2.0 Testing Profile so that architects can model the performance testing and data collection architecture in a standards-compatible way. We then extended the MDA framework to generate a load testing suite and an automatic performance measurement infrastructure. This greatly reduces the effort and expertise needed for benchmarking complex component and Web service technologies, while remaining fully compatible with the MDA standard. The approach complements current model-based performance prediction and analysis methods by generating the benchmark application from the same application architecture from which the performance models are derived. We illustrate the approach with two case studies based on Enterprise JavaBeans component technology and Web services.
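The abstract does not show what a generated load testing suite looks like. As a rough illustration only, the sketch below shows the general shape of such a driver: concurrent clients repeatedly invoke a service and record per-call latencies for later analysis. All class and method names here are hypothetical, and the service call is a sleep-based stub; it is not the code MDABench actually emits.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Hypothetical sketch of a load-test driver of the kind a generated
// performance measurement infrastructure might contain.
public class LoadTestDriver {

    // Stand-in for a generated EJB or Web-service client stub (hypothetical);
    // a real driver would call the deployed component here.
    static void invokeService() {
        try {
            Thread.sleep(5); // simulate ~5 ms of service time
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    // Run `clients` concurrent virtual users, each making `callsPerClient`
    // requests, and return the per-call latencies in nanoseconds.
    public static List<Long> run(int clients, int callsPerClient) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(clients);
        List<Future<List<Long>>> futures = new ArrayList<>();
        for (int c = 0; c < clients; c++) {
            futures.add(pool.submit(() -> {
                List<Long> latencies = new ArrayList<>();
                for (int i = 0; i < callsPerClient; i++) {
                    long start = System.nanoTime();
                    invokeService();
                    latencies.add(System.nanoTime() - start);
                }
                return latencies;
            }));
        }
        List<Long> all = new ArrayList<>();
        for (Future<List<Long>> f : futures) {
            all.addAll(f.get()); // wait for each virtual user to finish
        }
        pool.shutdown();
        return all;
    }

    public static void main(String[] args) throws Exception {
        List<Long> latencies = run(4, 10);
        double avgMs = latencies.stream()
                .mapToLong(Long::longValue)
                .average()
                .orElse(0) / 1e6;
        System.out.println("calls=" + latencies.size()
                + " avg_ms=" + String.format("%.2f", avgMs));
    }
}
```

In the approach described above, a driver like this would be generated from the UML 2.0 Testing Profile annotations rather than written by hand, which is where the reduction in benchmarking effort comes from.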