A loss-event driven scalable fluid simulation method for high-speed networks

  • Authors:
  • Suman Kumar, Seung-Jong Park, S. Sitharama Iyengar

  • Affiliations:
  • Computer Science and Center for Computation & Technology, Louisiana State University, Baton Rouge, LA 70803, United States (Kumar, Park); Computer Science, Louisiana State University, Baton Rouge, LA 70803, United States (Iyengar)

  • Venue:
  • Computer Networks: The International Journal of Computer and Telecommunications Networking
  • Year:
  • 2010

Abstract

The increase in size and bandwidth of computer networks has posed a research challenge for evaluating proposed TCP/IP protocols and the corresponding queuing policies. Simulation provides an easier and cheaper way to evaluate TCP proposals and queuing disciplines than experiments with real hardware. In this paper, the scalability problems of current simulation methods for high-speed networks are discussed. We then present a scalable, time-adaptive numerical simulation method, driven by loss events, that represents the dynamics of high-speed networks using fluid-based models. The new method uses loss events to dynamically adjust the time-step size of a numerical solver that integrates a system of differential equations representing the dynamics of protocols and node behaviors. A numerical analysis of the proposed method is discussed, and a simple simulation of high-speed TCP variants using our method is presented. The simulation results and analysis show that the time-adaptive method reduces computation time while achieving the same accuracy as a fixed step-size method.
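The central idea described in the abstract, driving the solver's step size from loss events, can be illustrated with a minimal sketch. The code below integrates a simplified single-bottleneck TCP fluid model (in the style of the Misra, Gong, and Towsley fluid equations) with forward Euler, taking large steps while the system evolves smoothly and small steps while a loss event is active. The parameter values, the threshold drop model, and the specific step-control rule are assumptions chosen for illustration; they are not the authors' exact formulation.

```python
# Minimal sketch of loss-event driven adaptive time stepping for a
# single-bottleneck TCP fluid model (Misra/Gong/Towsley style).
# All parameter values, the drop model, and the step-control rule
# below are illustrative assumptions, not the paper's exact scheme.

C = 1000.0        # bottleneck capacity (packets/s), assumed
N = 10            # number of TCP flows, assumed
Tp = 0.05         # round-trip propagation delay (s), assumed
q_max = 100.0     # queue level at which drops begin (crude drop model)

dt_min, dt_max = 1e-4, 1e-2   # bounds on the adaptive step (assumed)

def derivatives(W, q):
    """Right-hand side of the fluid model for window W and queue q."""
    R = Tp + q / C                        # round-trip time
    p = 1.0 if q >= q_max else 0.0        # loss indicator (assumption)
    dW = 1.0 / R - p * W * W / (2.0 * R)  # AIMD window dynamics
    dq = N * W / R - C                    # queue build-up / drain rate
    return dW, dq, p

def simulate(T=20.0):
    """Forward-Euler integration with a loss-event driven step size."""
    t, W, q, dt = 0.0, 1.0, 0.0, dt_max
    steps = 0
    while t < T:
        dW, dq, p = derivatives(W, q)
        # Loss-event driven step control: shrink the step during a loss
        # event (fast transient), grow it back in smooth regions.
        dt = dt_min if p > 0.0 else min(dt * 1.5, dt_max)
        W = max(W + dt * dW, 1.0)   # window never drops below one packet
        q = max(q + dt * dq, 0.0)   # queue cannot go negative
        t += dt
        steps += 1
    return W, q, steps

if __name__ == "__main__":
    W, q, steps = simulate()
    print(f"final window={W:.2f}, queue={q:.2f}, solver steps={steps}")
```

Compared with integrating the whole horizon at the fine step dt_min, a rule of this kind spends fine steps only around loss events, which is the source of the computational savings the abstract reports for the time-adaptive method relative to a fixed step-size solver.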