A versatile queueing model for data switching

  • Authors:
  • Richard V. Laue

  • Venue:
  • SIGCOMM '81 Proceedings of the seventh symposium on Data communications
  • Year:
  • 1981


Abstract

In a data network, when messages arrive at a switch to be served (transmitted) on a line, it seems reasonable to assume that the arrival process can be described as a Poisson (random) process. However, when messages are divided into a number of packets of a maximum length, these packets arrive bunched together, giving rise to what is referred to as “peaked” traffic. The degree of peakedness depends on 1) the interarrival time of packets associated with a particular message and 2) the distribution of the number of packets per message.

In this paper we describe a queueing model which accounts for the non-Poissonian nature of the packet arrival process as a function of these two factors. Since packets are of a fixed maximum length, the model assumes that the packet service time is constant, as opposed to the mathematically more tractable but less realistic assumption of exponentially distributed service time. This queueing model is then used to describe the network delay as affected by:

  1. Message switching versus packet switching,
  2. A priority discipline in the queues,
  3. Packet interarrival time per message, which is probably controlled by the line speed at the packet origination point, and
  4. A network which carries only short inquiry-response traffic as opposed to a network which also carries longer low-priority printer traffic.

The general conclusions are that the peakedness in the arrival process caused by a short interarrival time of packets per message, together with the longer printer traffic, would cause excessive delays in a network. If inquiry-response traffic with a short response-time requirement is also to be carried on the same network, a priority discipline has considerable value. Message switching for such a combination of traffic should be avoided.
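The effect the abstract describes can be illustrated with a small simulation sketch (not the paper's model): two packet streams offer the same load to a single server with constant service time, but in the "peaked" stream the packets of each message arrive bunched a short interarrival time apart, while the "smooth" stream is pure Poisson. All parameter values below are hypothetical, chosen only for illustration.

```python
import random

def mean_wait(arrivals, service=1.0):
    """FIFO single-server queue with constant (deterministic) service time.
    Returns the mean time packets spend waiting before service begins."""
    free_at = 0.0      # time at which the server next becomes idle
    total_wait = 0.0
    for t in sorted(arrivals):
        start = max(free_at, t)      # wait if the server is busy
        total_wait += start - t
        free_at = start + service
    return total_wait / len(arrivals)

random.seed(1)
msg_rate = 0.07        # message arrival rate (hypothetical)
packets_per_msg = 8    # fixed batch size, for simplicity
gap = 0.1              # packet interarrival time within a message (short => peaked)
horizon = 200_000.0

# Message arrival epochs: a Poisson process at msg_rate.
msg_times, t = [], 0.0
while True:
    t += random.expovariate(msg_rate)
    if t > horizon:
        break
    msg_times.append(t)

# Peaked traffic: the packets of one message arrive bunched 'gap' apart.
peaked = [m + i * gap for m in msg_times for i in range(packets_per_msg)]

# Smooth traffic: the same number of packets, but as a pure Poisson
# stream at the equivalent packet rate (msg_rate * packets_per_msg).
smooth, t = [], 0.0
while len(smooth) < len(peaked):
    t += random.expovariate(msg_rate * packets_per_msg)
    smooth.append(t)

w_peaked = mean_wait(peaked)
w_smooth = mean_wait(smooth)
print(f"mean wait — peaked: {w_peaked:.2f}, smooth Poisson: {w_smooth:.2f}")
```

At the same server utilization, the bunched stream shows a markedly larger mean queueing delay than the Poisson stream, which is the qualitative point of the abstract: assuming Poisson packet arrivals understates delay when packets of a message arrive close together.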