Statistical analysis of TCP's retransmission timeout algorithm

  • Authors:
  • Liangping Ma; Kenneth E. Barner; Gonzalo R. Arce

  • Affiliations:
  • San Diego Research Center Inc., San Diego, CA; Department of Electrical and Computer Engineering, University of Delaware, Newark, DE; Department of Electrical and Computer Engineering, University of Delaware, Newark, DE

  • Venue:
  • IEEE/ACM Transactions on Networking (TON)
  • Year:
  • 2006

Abstract

The retransmission timeout (RTO) algorithm of the Transmission Control Protocol (TCP), which sets a dynamic upper bound on the next round-trip time (RTT) based on past RTTs, plays an important role in reliable data transfer and congestion control in the Internet. A rigorous theoretical analysis of the RTO algorithm is important because it provides insight into the algorithm and informs optimal design strategies; nevertheless, no such analysis has been conducted to date. This paper presents such an analysis using a statistical approach. We construct an auto-regressive (AR) model for the RTT process based on experimental results indicating that: 1) RTTs along a given Internet path can be modeled by a shifted Gamma distribution, and 2) the temporal correlation of RTTs decreases quickly with lag. The model is used to determine the average reaction time and the premature timeout probability of the RTO algorithm. We derive a closed-form expression for the first measure and a formula for numerically calculating the second. Both measures are validated through tests on simulated and real RTT data. The theoretical analysis strengthens a number of observations reported in past experiment-oriented studies.
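
For intuition about the quantities the paper analyzes, the sketch below generates a synthetic RTT trace whose marginal is roughly a shifted Gamma with short-lag correlation, runs the standard TCP RTO estimator over it (RFC 6298 style exponentially weighted SRTT/RTTVAR), and counts how often the next RTT exceeds the current RTO. The AR(1) construction and all parameter values are illustrative assumptions for this sketch, not the paper's fitted model or reported results.

```python
import numpy as np

# --- Synthetic RTT process (illustrative assumptions, not the paper's fitted model) ---
# RTTs are built from i.i.d. Gamma innovations passed through an AR(1) smoother,
# giving a shifted-Gamma-like marginal with short-lag temporal correlation.
rng = np.random.default_rng(0)
shift, shape, scale, phi = 0.050, 2.0, 0.010, 0.4   # seconds; hypothetical parameters
n = 100_000

innov = rng.gamma(shape, scale, size=n)             # i.i.d. Gamma innovations
x = np.empty(n)
x[0] = innov[0]
for t in range(1, n):                               # AR(1) smoothing of the Gamma draws
    x[t] = phi * x[t - 1] + (1.0 - phi) * innov[t]
rtt = shift + x                                     # shifted, positively skewed RTT samples

# --- Standard TCP RTO estimator (RFC 6298 style SRTT/RTTVAR smoothing) ---
# Note: RFC 6298 additionally clamps RTO to at least 1 s; that clamp is omitted
# here so the raw behaviour of the estimator is visible.
ALPHA, BETA, K = 1 / 8, 1 / 4, 4

srtt, rttvar = rtt[0], rtt[0] / 2
rto = srtt + K * rttvar
premature = 0
for r in rtt[1:]:
    if r > rto:                                     # next RTT exceeded the bound: premature timeout
        premature += 1
    rttvar = (1 - BETA) * rttvar + BETA * abs(srtt - r)
    srtt = (1 - ALPHA) * srtt + ALPHA * r
    rto = srtt + K * rttvar                         # RTO computed from past RTTs bounds the next one

print(f"empirical premature-timeout probability: {premature / (n - 1):.4%}")
```

Changing the correlation parameter or the Gamma shape in this toy setup shifts the empirical premature-timeout rate, which is the kind of dependence the paper's analysis captures in closed or numerically computable form.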