General dynamic routing with per-packet delay guarantees of O(distance+1/session rate)

  • Authors:
  • M. Andrews, A. Fernandez, M. Harchol-Balter, T. Leighton, L. Zhang

  • Venue:
  • FOCS '97: Proceedings of the 38th Annual Symposium on Foundations of Computer Science
  • Year:
  • 1997

Abstract

A central issue in the design of modern communication networks is that of providing performance guarantees. This issue is particularly important if the networks support real-time traffic such as voice and video. The most critical performance parameter to bound is the delay experienced by a packet as it travels from its source to its destination. We study dynamic routing in a connection-oriented packet-switching network. We consider a network with arbitrary topology on which a set of sessions is defined. For each session i, packets are injected at a rate r_i to follow a predetermined path of length d_i. Due to limited bandwidth, only one packet at a time may advance on an edge. Session paths may overlap, subject to the constraint that the total rate of the sessions using any particular edge is less than 1. We address the problem of scheduling the sessions at each switch so as to minimize worst-case packet delay and queue buildup at the switches. We show the existence of an asymptotically optimal schedule that achieves a delay bound of O(1/r_i + d_i) with only constant-size queues at the switches. We also present a simple distributed algorithm that, with high probability, delivers every session-i packet to its destination within O(1/r_i + d_i log(m/r_min)) steps of its injection, where r_min is the minimum session rate and m is the number of edges in the network. Our results generalize to (leaky-bucket constrained) bursty traffic, where session i tolerates a burst of size b_i; in this case the delay bounds become O(b_i/r_i + d_i) and O(b_i/r_i + d_i log(m/r_min)), respectively.
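
The traffic model in the abstract is concrete enough to sketch in code. Below is a minimal Python simulation of that model, assuming a simple per-edge "oldest packet first" scheduler and leaky-bucket-style injection; the Session class and the check_admissible/simulate helpers are illustrative names, and none of this is the schedule or the distributed algorithm analyzed in the paper. It only demonstrates the setup: injection rates r_i, fixed paths of length d_i, one packet per edge per time step, and the admissibility condition that every edge's total session rate stays below 1.

```python
from collections import defaultdict, deque

# Toy discrete-time simulation of the session model described in the abstract:
# session i injects packets at rate r_i along a fixed path of d_i edges, at most
# one packet crosses an edge per time step, and the total rate of the sessions
# sharing any edge must be less than 1.  The per-edge scheduler used here
# (oldest packet first) is a stand-in for illustration only; it is not the
# schedule constructed in the paper.

class Session:
    def __init__(self, rate, path):
        self.rate = rate      # r_i: average packets injected per time step
        self.path = path      # sequence of edge identifiers, length d_i
        self.credit = 0.0     # leaky-bucket accumulator driving injections

def check_admissible(sessions):
    """The abstract's constraint: total session rate on every edge < 1."""
    load = defaultdict(float)
    for s in sessions:
        for e in s.path:
            load[e] += s.rate
    return all(total < 1.0 for total in load.values())

def simulate(sessions, steps):
    assert check_admissible(sessions), "some edge is over-subscribed"
    queues = defaultdict(deque)   # edge -> queue of (injection_time, session, hop)
    delays = []                   # end-to-end delay of each delivered packet
    for t in range(steps):
        # Injection: a session releases one packet whenever its credit reaches 1.
        for s in sessions:
            s.credit += s.rate
            if s.credit >= 1.0:
                s.credit -= 1.0
                queues[s.path[0]].append((t, s, 0))
        # Scheduling: each edge forwards at most one packet per step; moves are
        # collected first so no packet advances more than one hop per step.
        moves = []
        for e in list(queues):
            if queues[e]:
                pkt = min(queues[e], key=lambda p: p[0])  # oldest packet first
                queues[e].remove(pkt)
                moves.append(pkt)
        for inj, s, hop in moves:
            if hop + 1 < len(s.path):
                queues[s.path[hop + 1]].append((inj, s, hop + 1))
            else:
                delays.append(t + 1 - inj)   # packet reached its destination
    return delays

# Example: two sessions sharing edge "b"; the total rate there is 0.4 + 0.5 < 1.
sessions = [Session(0.4, ["a", "b", "c"]), Session(0.5, ["b", "d"])]
print("max observed delay:", max(simulate(sessions, 2000)))
```

Running the example prints the largest end-to-end delay observed over the simulated steps; with the two sessions above it stays small, consistent with the intuition that admissible edge loads keep queues from growing, though this toy scheduler carries none of the paper's worst-case guarantees.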