Performance evaluation of domain reference architectures

  • Authors:
  • K. Suzanne Barber; Jim Holt; Geoff Baker

  • Affiliations:
  • University of Texas at Austin; Motorola, Austin, TX; Motorola

  • Venue:
  • SEKE '02: Proceedings of the 14th International Conference on Software Engineering and Knowledge Engineering

  • Year:
  • 2002

Abstract

Architectures embody the requirements expressed by system stakeholders, and the type of architecture used to capture a given set of requirements dictates when evaluation can occur and what will be evaluated. This research aims to leverage requirements and a resulting architecture dictated by the problem domain and captured early in the lifecycle. Thus, the research goal is to provide early performance evaluation in an effort to convey the most accurate blueprint to system implementers specifically and to system stakeholders in general. A new software architecture evaluation tool called Arcade, developed to support the Systems Engineering Process Activities (SEPA), automates early performance evaluation of software architectures using simulation. SEPA suggests a comprehensive approach to capturing and representing different types of requirements as a multi-level software architecture. One SEPA architecture level, the Domain Reference Architecture (DRA), encompasses performance characteristics inherent to the domain in terms of what processes, data, and timing are required, rather than how a system should be implemented. Performance evaluation of a DRA can provide quantitative data to (1) aid in the identification and validation of domain-related performance concerns, and (2) provide system implementers with performance guidelines toward satisfying the performance constraints inherent to the domain. The range of performance statistics that Arcade can analyze is demonstrated through a case study of a Motorola e-business project.
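
The paper itself is not reproduced here, but as a rough sketch of what simulation-based performance evaluation of a process-and-timing architecture model can look like, the following minimal discrete-event simulation pushes requests through a hypothetical three-process e-business pipeline and reports latency and utilization statistics. It is not Arcade and makes no claim about SEPA or the DRA notation; all process names, arrival rates, and service times are invented for illustration.

```python
import heapq
import random
import statistics

# Hypothetical three-stage request pipeline (names and timings are invented
# for illustration; they are not taken from the paper or from Arcade).
# Each stage is modeled as a single FIFO server with exponential service times.
PIPELINE = [
    ("validate_order", 0.020),     # mean service time in seconds
    ("check_inventory", 0.050),
    ("authorize_payment", 0.080),
]
ARRIVAL_RATE = 10.0                # Poisson arrivals, requests per second
NUM_REQUESTS = 5000


def simulate(seed=0):
    rng = random.Random(seed)
    events = []                            # min-heap of (time, seq, stage, arrival_time)
    busy_until = [0.0] * len(PIPELINE)     # earliest time each stage is free
    busy_time = [0.0] * len(PIPELINE)      # accumulated service time per stage
    latencies = []

    # Generate Poisson arrivals into the first stage.
    t = 0.0
    for seq in range(NUM_REQUESTS):
        t += rng.expovariate(ARRIVAL_RATE)
        heapq.heappush(events, (t, seq, 0, t))

    end_time = 0.0
    while events:
        now, seq, stage, arrived = heapq.heappop(events)
        mean_service = PIPELINE[stage][1]
        service = rng.expovariate(1.0 / mean_service)
        start = max(now, busy_until[stage])     # queue if the stage is busy
        finish = start + service
        busy_until[stage] = finish
        busy_time[stage] += service
        end_time = max(end_time, finish)
        if stage + 1 < len(PIPELINE):
            heapq.heappush(events, (finish, seq, stage + 1, arrived))
        else:
            latencies.append(finish - arrived)  # end-to-end latency

    latencies.sort()
    print(f"mean latency: {statistics.mean(latencies) * 1000:.1f} ms")
    print(f"95th percentile latency: {latencies[int(0.95 * len(latencies))] * 1000:.1f} ms")
    for (name, _), busy in zip(PIPELINE, busy_time):
        print(f"utilization of {name}: {busy / end_time:.0%}")


if __name__ == "__main__":
    simulate()
```

In a tool of the kind the abstract describes, the model would presumably be derived from the DRA's process, data, and timing specifications rather than hard-coded constants, and the reported statistics would be traced back to domain performance constraints.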