Most distributed system specifications have performance benchmark requirements. However, determining the likely performance of complex distributed system architectures during development is very challenging. We describe a system where software architects sketch an outline of their proposed system architecture at a high level of abstraction, including indicating client requests, server services, and choosing particular kinds of middleware and database technologies. A fully-working implementation of this system is then automatically generated, allowing multiple clients and servers to be run. Performance tests are then automatically run for this generated code and results are displayed back in the original high-level architectural diagrams. Architects may change performance parameters and architecture characteristics, comparing multiple test run results to determine the most suitable abstractions to refine to detailed designs for actual system implementation. We demonstrate the utility of this approach and the accuracy of our generated performance test-beds for validating architectural choices during early system development.
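
To make the generated test-bed idea concrete, below is a minimal, hypothetical Java sketch of the kind of benchmark client such a generator might emit: it runs multiple concurrent clients against a stubbed service invocation, records per-request latency, and reports throughput and average response time. The names (GeneratedBenchmarkClient, invokeService) and all parameter values are illustrative assumptions, not the paper's actual generated code.

    import java.util.concurrent.*;
    import java.util.concurrent.atomic.*;

    // Hypothetical sketch of a generated performance test-bed client.
    public class GeneratedBenchmarkClient {

        // Stand-in for a generated middleware call (e.g. an EJB or
        // web-service invocation); sleep simulates server-side work.
        static void invokeService() throws InterruptedException {
            Thread.sleep(5);
        }

        public static void main(String[] args) throws Exception {
            int clients = 10;            // concurrent clients: an architecture parameter
            int requestsPerClient = 100; // workload size per client

            ExecutorService pool = Executors.newFixedThreadPool(clients);
            AtomicLong totalLatencyNanos = new AtomicLong();
            CountDownLatch done = new CountDownLatch(clients);

            long start = System.nanoTime();
            for (int c = 0; c < clients; c++) {
                pool.submit(() -> {
                    try {
                        for (int i = 0; i < requestsPerClient; i++) {
                            long t0 = System.nanoTime();
                            invokeService();
                            totalLatencyNanos.addAndGet(System.nanoTime() - t0);
                        }
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    } finally {
                        done.countDown();
                    }
                });
            }
            done.await();
            pool.shutdown();

            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            long requests = (long) clients * requestsPerClient;
            System.out.printf(
                "requests=%d, elapsed=%d ms, throughput=%.1f req/s, avg latency=%.2f ms%n",
                requests, elapsedMs,
                requests * 1000.0 / elapsedMs,
                totalLatencyNanos.get() / 1_000_000.0 / requests);
        }
    }

In the approach the abstract describes, parameters such as the client count and the middleware behind invokeService would come from the architect's high-level sketch, and the measured results would be fed back into the architectural diagrams for comparison across test runs.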