On the performance of greedy algorithms in packet buffering

  • Authors:
  • Susanne Albers; Markus Schmidt

  • Affiliations:
  • Albert-Ludwigs-Universität Freiburg, Freiburg, Germany; Albert-Ludwigs-Universität Freiburg, Freiburg, Germany

  • Venue:
  • STOC '04 Proceedings of the thirty-sixth annual ACM symposium on Theory of computing
  • Year:
  • 2004

Abstract

We study a basic buffer management problem that arises in network switches. Consider m input ports, each of which is equipped with a buffer (queue) of limited capacity. Data packets arrive online and can be stored in the buffers if space permits; otherwise packet loss occurs. In each time step the switch can transmit one packet from one of the buffers to the output port. The goal is to maximize the number of transmitted packets. Simple arguments show that any reasonable algorithm, i.e. one that serves any non-empty buffer, is 2-competitive. Azar and Richter recently presented a randomized online algorithm and gave lower bounds for deterministic and randomized strategies.

In practice greedy algorithms are very important because they are fast, use little extra memory and reduce packet loss by always serving a longest queue. In this paper we first settle the competitive performance of the entire family of greedy strategies. We prove that greedy algorithms are not better than 2-competitive no matter how ties are broken. Our lower bound proof uses a new recursive construction for building adversarial buffer configurations that may be of independent interest. We also give improved lower bounds for deterministic and randomized online algorithms.

In the second part of the paper we present the first deterministic online algorithm that is better than 2-competitive. We develop a modified greedy algorithm, called Semi-Greedy, and prove that it achieves a competitive ratio of $17/9 \approx 1.89$. The new algorithm is simple, fast and uses little extra memory. It deviates from serving a longest queue only when the risk of packet loss is low. Additionally, we study scenarios in which an online algorithm is granted additional resources. We consider resource augmentation with respect to memory and speed, i.e. an online algorithm may be given larger buffers or higher transmission rates. We analyze greedy and other online strategies.
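To make the model concrete, the following is a minimal sketch of the multi-queue buffering setting and the greedy "serve a longest queue" rule described in the abstract. The function name simulate_greedy, the arrival encoding, and the within-step ordering (arrivals processed before transmission) are illustrative assumptions, not details taken from the paper.

```python
from typing import List, Sequence


def simulate_greedy(m: int, capacity: int, arrivals: Sequence[Sequence[int]]) -> int:
    """Return the number of packets transmitted by a greedy policy.

    arrivals[t] lists the ports (0..m-1) that each receive one packet at step t.
    A packet arriving at a full buffer is lost. Each step, one packet is
    transmitted from a longest non-empty queue (ties broken arbitrarily).
    """
    queues: List[int] = [0] * m   # current occupancy of each buffer
    transmitted = 0
    for step_arrivals in arrivals:
        # Arrival phase: accept a packet if its buffer has free space, else drop it.
        for port in step_arrivals:
            if queues[port] < capacity:
                queues[port] += 1
        # Service phase: greedily transmit from a longest non-empty queue.
        longest = max(range(m), key=lambda i: queues[i])
        if queues[longest] > 0:
            queues[longest] -= 1
            transmitted += 1
    return transmitted


if __name__ == "__main__":
    # Two ports with buffers of size 2; port 0 is flooded, so greedy keeps serving it.
    print(simulate_greedy(m=2, capacity=2, arrivals=[[0, 0, 1], [0], [0], []]))
```

This sketch only illustrates the problem model; the paper's Semi-Greedy algorithm and its 17/9 analysis are not reproduced here.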