The performance of CORBA (Common Object Request Broker Architecture) objects is greatly influenced by the application context and by the performance of the ORB endsystem, which consists of the middleware, the operating system, and the underlying network. Application developers need to evaluate how candidate application object architectures will perform within heterogeneous computing environments, but this is difficult without standard, user-extensible performance benchmark suites that exercise all aspects of the ORB endsystem under realistic application scenarios. This paper introduces the Performance Pattern Language and the Performance Measurement Object, which address these problems by providing an automated, script-based framework within which extensive ORB endsystem performance benchmarks can be efficiently described and automatically executed.
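To make the idea of a script-driven benchmark framework concrete, the sketch below shows one way a declarative benchmark specification could drive timed invocations and collect latency statistics. This is a minimal illustration under assumed names (`BENCHMARK_SPEC`, `run_benchmark`, `echo_stub`), not the paper's actual Performance Pattern Language or Performance Measurement Object API; a loopback stub stands in for a remote CORBA invocation.

```python
import statistics
import time

def echo_stub(payload: bytes) -> bytes:
    """Stand-in for a remote CORBA invocation (loopback echo).

    In a real ORB endsystem benchmark this would be a two-way
    operation on a remote object reference.
    """
    return payload

# A declarative spec, in the spirit of a script-based benchmark
# description: which operation to invoke, with what payload size,
# and how many times. All names here are illustrative assumptions.
BENCHMARK_SPEC = {
    "operation": echo_stub,
    "payload_bytes": 1024,
    "iterations": 1000,
}

def run_benchmark(spec):
    """Execute the spec and return summary latency statistics."""
    payload = b"x" * spec["payload_bytes"]
    op = spec["operation"]
    samples = []
    for _ in range(spec["iterations"]):
        start = time.perf_counter()
        op(payload)
        samples.append(time.perf_counter() - start)
    samples.sort()
    return {
        "mean_s": statistics.mean(samples),
        "p99_s": samples[int(0.99 * len(samples))],
    }

if __name__ == "__main__":
    report = run_benchmark(BENCHMARK_SPEC)
    print(f"mean latency: {report['mean_s'] * 1e6:.1f} us, "
          f"p99: {report['p99_s'] * 1e6:.1f} us")
```

A framework of this kind lets developers vary payload sizes, threading models, or target objects by editing the specification rather than the measurement harness, which is the property the abstract attributes to a script-based approach.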