Model-driven generative framework for automated OMG DDS performance testing in the cloud

  • Authors:
  • Kyoungho An; Takayuki Kuroda; Aniruddha Gokhale; Sumant Tambe; Andrea Sorbini

  • Affiliations:
  • Vanderbilt University, Nashville, TN, USA; Vanderbilt University, Nashville, TN, USA; Vanderbilt University, Nashville, TN, USA; RTI, Sunnyvale, CA, USA; RTI, Sunnyvale, CA, USA

  • Venue:
  • Proceedings of the 2013 companion publication for conference on Systems, programming, & applications: software for humanity
  • Year:
  • 2013

Abstract

The Object Management Group's (OMG) Data Distribution Service (DDS) provides many configurable policies that determine the end-to-end quality of service (QoS) delivered to applications. It is challenging, however, to predict an application's performance in terms of latency, throughput, and resource usage, because different combinations of QoS configurations influence application QoS in different ways. Design-time formal methods have been applied to this problem with mixed success, but insufficient prediction accuracy, limited tool support, and the difficulty of understanding the formalisms have prevented their wider adoption. A promising alternative is to emulate application behavior and gather data on the QoS parameters of interest through experimentation. To realize this approach, we have developed a middleware framework that uses model-driven generative mechanisms to automate performance testing of a large number of DDS QoS configuration combinations, which can be deployed and tested on a cloud platform.
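
The combinatorial space the framework must cover can be sketched as follows. This is a minimal, hypothetical illustration (the policy names and candidate values are representative examples from the DDS QoS model, not the framework's actual model or API): each DDS QoS policy contributes a set of candidate settings, and the test campaign is the Cartesian product of those sets.

```python
# Hypothetical sketch of enumerating DDS QoS configuration combinations
# for automated performance testing. Policies and values are illustrative
# examples; the paper's framework generates such configurations via
# model-driven tooling rather than this hand-written dictionary.
from itertools import product

# A few representative DDS QoS policies with candidate settings (assumed).
qos_space = {
    "reliability": ["BEST_EFFORT", "RELIABLE"],
    "durability":  ["VOLATILE", "TRANSIENT_LOCAL"],
    "history":     ["KEEP_LAST_1", "KEEP_LAST_10", "KEEP_ALL"],
}

def qos_combinations(space):
    """Yield every combination of policy settings as a dict."""
    names = sorted(space)
    for values in product(*(space[name] for name in names)):
        yield dict(zip(names, values))

configs = list(qos_combinations(qos_space))
print(len(configs))  # 2 * 2 * 3 = 12 configurations to deploy and test
```

Even this toy space yields 12 distinct test deployments; realistic policy sets grow multiplicatively, which is why the paper automates generation, deployment, and measurement on a cloud platform rather than testing configurations by hand.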