Modeling Communication Delays in Distributed Systems Using Time Series

  • Authors: Raul Ceretta Nunes, Ingrid Jansch-Pôrto

  • Venue: SRDS '02 Proceedings of the 21st IEEE Symposium on Reliable Distributed Systems
  • Year: 2002


Abstract

The design of dependable distributed applications is a hard task, mainly because of the hard-to-characterize statistical behavior of communication delays. Despite this, in practice most system monitors use timeouts (a maximum waiting time) to ensure termination properties in their protocols. To obtain better results, some monitors dynamically predict new timeout values from observed communication delays, improving the performance and accuracy of their protocols. In recent years, time series theory has emerged in computer science as a good tool to increase prediction accuracy. A time series is a mathematical model that describes a sequence of observations of a stochastic process taken periodically in time. This paper shows how to model the round-trip communication delay observed by a pull monitor (a periodic requester source), despite its non-periodic answers. As a result, the paper shows that the round-trip communication delay pattern can be properly represented as a time series. This time series is an alternative model for exploring new timeout predictors in distributed systems.
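To make the idea of a time-series-based timeout predictor concrete, the sketch below fits a simple first-order autoregressive (AR(1)) model to a window of observed round-trip delays and pads the one-step-ahead prediction with a safety margin. This is an illustrative assumption, not the predictor proposed in the paper; the function names, the AR(1) choice, and the margin factor are all hypothetical.

```python
# Illustrative sketch (not the paper's method): predict the next round-trip
# delay with an AR(1) time-series model and derive a timeout from it.

def fit_ar1(series):
    """Least-squares fit of x[t] = c + phi * x[t-1] over the observed series."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    phi = cov / var if var else 0.0   # autoregressive coefficient
    c = my - phi * mx                 # intercept
    return c, phi

def predict_timeout(delays, margin=1.25):
    """One-step-ahead delay prediction, padded by a safety factor (assumed policy)."""
    c, phi = fit_ar1(delays)
    predicted = c + phi * delays[-1]
    return predicted * margin

# Example: round-trip delays (ms) as a periodic pull monitor might observe them.
delays = [10.2, 11.0, 10.7, 12.1, 11.5, 11.9, 12.4, 11.8]
timeout = predict_timeout(delays)
```

A predictor like this adapts the timeout as delay observations arrive, which is the behavior the abstract attributes to dynamic-timeout monitors; the paper's contribution is showing that the pull monitor's round-trip delays can legitimately be treated as such a series despite non-periodic answers.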