Redundancy-controllable adaptive retransmission timeout estimation for packet video

  • Authors:
  • Ali C. Begen, Yucel Altunbasak

  • Affiliations:
  • Georgia Institute of Technology, Atlanta, GA

  • Venue:
  • Proceedings of the 2006 International Workshop on Network and Operating Systems Support for Digital Audio and Video
  • Year:
  • 2006

Abstract

Time-constrained error recovery is an integral component of reliable low-delay video applications. Regardless of the error-control method adopted by the application, unacknowledged or missing packets must be quickly identified as lost or delayed so that the server/client can take timely corrective action. Historically, this problem has been referred to as retransmission timeout (RTO) estimation. Earlier studies show that existing RTO estimators suffer from either long loss-detection times or a large number of premature timeouts. The goal of this study is to address these problems by developing an adaptive RTO estimator for high-bitrate, low-delay video applications. By exploiting the temporal dependence between consecutive delay samples, we propose an adaptive linear delay predictor. In this way, our RTO estimator configures itself based on the video characteristics and varying network conditions. Our approach also features a controller that optimally manages the trade-off between the amount of overwaiting and the redundant retransmission rate. A skeleton implementation shows that the proposed RTO estimator discriminates lost packets from excessively delayed packets faster and more accurately than its rivals, which consequently enables applications to recover more packets under stringent delay requirements.
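
The abstract only names the key components (an adaptive linear delay predictor and a controller balancing overwaiting against redundant retransmissions). As a rough illustration of the general idea, the sketch below forecasts the next delay with a linear predictor updated by a normalized LMS rule and sets the timeout to the prediction plus an error-scaled margin. The class name, parameters (order, step, margin_factor), and the NLMS/EWMA choices are assumptions made for illustration and are not the estimator actually proposed in the paper.

```python
import collections

class AdaptiveRTOEstimator:
    """Sketch of an adaptive RTO estimator built around a linear delay
    predictor. The predictor forecasts the next delay from the last
    `order` observed delays and adapts with a normalized LMS rule; the
    timeout adds a margin scaled by the recent prediction error."""

    def __init__(self, order=4, step=0.5, margin_factor=2.0):
        self.order = order                      # number of past delay samples used
        self.step = step                        # NLMS adaptation step size
        self.margin_factor = margin_factor      # trades overwaiting vs. redundant retransmissions
        self.weights = [1.0 / order] * order    # linear predictor coefficients
        self.history = collections.deque([0.0] * order, maxlen=order)
        self.avg_error = 0.0                    # smoothed absolute prediction error

    def predict_delay(self):
        # Linear combination of the most recent delay samples.
        return sum(w * d for w, d in zip(self.weights, self.history))

    def update(self, observed_delay):
        # NLMS update: nudge the weights to reduce the prediction error.
        predicted = self.predict_delay()
        error = observed_delay - predicted
        norm = sum(d * d for d in self.history) + 1e-9
        self.weights = [w + self.step * error * d / norm
                        for w, d in zip(self.weights, self.history)]
        # EWMA of the absolute error drives the safety margin.
        self.avg_error = 0.875 * self.avg_error + 0.125 * abs(error)
        self.history.appendleft(observed_delay)

    def rto(self):
        # Timeout = predicted delay + margin; margin_factor sets the
        # overwaiting / redundant-retransmission trade-off.
        return self.predict_delay() + self.margin_factor * self.avg_error
```

In use, the estimator would be updated with each measured packet delay, and a packet still unacknowledged after `rto()` would be declared lost and scheduled for recovery; raising `margin_factor` reduces redundant retransmissions at the cost of longer waits, and lowering it does the opposite.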