A new mathematical model for optimizing the performance of parallel and discrete event simulation systems

  • Authors:
  • Syed S. Rizvi; Khaled M. Elleithy; Aasia Riasat

  • Affiliations:
  • University of Bridgeport, Bridgeport, CT; University of Bridgeport, Bridgeport, CT; Institute of Business Management, Karachi, Pakistan

  • Venue:
  • Proceedings of the 2008 Spring Simulation Multiconference
  • Year:
  • 2008


Abstract

The null message algorithm (NMA) is an important conservative time-management protocol in parallel discrete event simulation systems: it synchronizes distributed logical processes while both avoiding and resolving deadlock. However, the excessive generation of null messages prevents the widespread use of this algorithm. This excess results from improper settings of critical parameters such as the null-message transmission frequency and the Lookahead values. If the generation of null messages could be minimized, most parallel discrete event simulation systems would likely adopt this algorithm to gain higher system throughput and lower transmission delays. In this paper, a new mathematical model for optimizing the performance of parallel and distributed simulation systems is proposed. The proposed model applies optimization techniques, such as the variance of null message elimination, to improve the performance of parallel and distributed simulation systems. For the simulation results, we consider both uniform and non-uniform distributions of Lookahead values across the multiple output lines of a logical process (LP). Our experimental verification demonstrates that an optimized NMA offers better scalability in parallel discrete event simulation systems when it is used with a proper selection of these critical parameters.
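The abstract's central point is that the null-message overhead of a conservative (Chandy-Misra-Bryant-style) protocol is governed by the Lookahead value: each null message carries the timestamp "local clock + Lookahead", so a small Lookahead forces many messages to advance a neighbor by the same amount of simulated time. The Python sketch below is only a toy illustration of that relationship, not the authors' mathematical model; the function name and parameters are hypothetical.

```python
# Toy illustration (not the paper's model) of how the Lookahead value
# controls null-message traffic in a conservative PDES protocol.

def null_messages_needed(start_time: float, end_time: float, lookahead: float) -> int:
    """Count the null messages an otherwise idle LP must send so that a
    neighbor can safely advance its clock from start_time to end_time.
    Each null message carries the timestamp clock + lookahead, so the
    neighbor's safe bound advances by 'lookahead' per message."""
    if lookahead <= 0:
        raise ValueError("Lookahead must be positive for deadlock avoidance")
    count = 0
    clock = start_time
    while clock < end_time:
        clock += lookahead  # safe time bound announced by this null message
        count += 1
    return count

if __name__ == "__main__":
    # Larger Lookahead values sharply reduce the null-message count,
    # which is the overhead the paper's optimization seeks to minimize.
    for la in (1, 2, 5, 10):
        print(f"lookahead={la:>2}: {null_messages_needed(0, 100, la)} null messages")
```

In this toy model, doubling the Lookahead halves the null-message count, which is consistent with the abstract's claim that proper selection of critical parameters (transmission frequency and Lookahead) determines whether the NMA scales well.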