A Quantization Approximation for Modeling Computer Network Nodal Queueing Delay

  • Authors:
  • C. A. Niznik

  • Affiliations:
  • Department of Electrical Engineering, University of Pittsburgh

  • Venue:
  • IEEE Transactions on Computers
  • Year:
  • 1983

Abstract

A new approximation model for the analysis of a finite buffer GI/G/1 system is presented. The approach formulates the computer node mean waiting time from a discrete-time solution (marginal overflow customers per time slot) derived from a continuous-time solution (marginal overflow time per customer). The key to the model solution is the quantization of the distribution f_U(u) of the difference between customer service time and customer interarrival time, yielding the areas of sections (quantiles) of this probability density function. These quantiles form the entries of the probability transition matrix for the change in the number of customers allowed in the queue. Irreducible Markov chains represent the uniform-quantization lower and upper bound steady-state buffer occupancy solutions for the number of customers in the queue at time slot j. A guideline for selecting the optimal quantization interval width is the numerical relation observed between the optimal ranges of Peak Measurement Accuracy and Peak Measurement Complexity for finite areas of f_U(u).
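
As a rough illustration of the quantization step described in the abstract, the sketch below quantizes a density f_U(u) of U = service time minus interarrival time into interval areas (quantiles), uses them as entries of a transition matrix over a finite buffer, and solves for the steady-state occupancy. All concrete choices are assumptions for illustration only: the exponential service and interarrival distributions, the buffer size K, the interval width delta, the number of quantiles M, and the clipping of overflow/underflow at the buffer boundaries are not taken from the paper, and the paper's distinct lower and upper bound chains are collapsed here into a single mid-interval quantization.

```python
import numpy as np
from scipy.integrate import quad

# Illustrative parameters (assumptions, not values from the paper)
mu, lam = 1.2, 1.0        # exponential service rate and arrival rate (stand-in GI/G/1 case)
delta = 1.0               # quantization interval width
K = 10                    # finite buffer size (max customers in queue)
M = 5                     # number of quantiles on each side of zero

def f_U(u):
    """Density of U = S - A for exponential service S ~ Exp(mu), interarrival A ~ Exp(lam)."""
    c = mu * lam / (mu + lam)
    return c * np.exp(-mu * u) if u >= 0 else c * np.exp(lam * u)

# Quantiles: area of f_U over each quantization interval; q[k] ~ P(queue changes by k per slot)
q = {}
for k in range(-M, M + 1):
    lo, hi = (k - 0.5) * delta, (k + 0.5) * delta
    q[k], _ = quad(f_U, lo, hi)
q[-M] += quad(f_U, -np.inf, (-M - 0.5) * delta)[0]   # fold the tails into the end quantiles
q[M]  += quad(f_U, (M + 0.5) * delta, np.inf)[0]

# Transition matrix over buffer occupancy 0..K; moves past the boundaries are
# clipped to the boundary states (an assumed boundary treatment).
P = np.zeros((K + 1, K + 1))
for i in range(K + 1):
    for k, pk in q.items():
        j = min(max(i + k, 0), K)
        P[i, j] += pk

# Steady-state buffer occupancy: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi /= pi.sum()
print("steady-state occupancy:", np.round(pi, 4))
print("mean queue length:", pi @ np.arange(K + 1))
```

Shrinking delta (with M scaled up accordingly) refines the quantization and, in the spirit of the abstract's guideline, trades measurement accuracy against the size of the transition matrix and hence computational complexity.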