Quick simulation of rare events in networks

  • Authors:
  • R. D. Fresnedo

  • Venue:
  • WSC '89 Proceedings of the 21st conference on Winter simulation
  • Year:
  • 1989

Abstract

We study the problem of how to simulate the occurrence of rare events in networks of queues; an interesting application is computing the expected time until the network buffers fill up. We show that the unique optimal (minimum-variance) change of measure (importance sampling) for simulating an event is given by the law of the process conditioned on that event (rare or not). Some theory is needed to circumvent the fact that knowledge of the conditional laws implies knowledge of the solution. We present two ways to handle the problem. Boundary theory of Markov chains provides the theoretical framework. The method sheds light on the way that rare events happen; this in turn explains why some large-deviations ("LD" in what follows) heuristics (Walrand and Parekh) fail for important combinations of parameter values (optimal buffer allocation). A compactness argument and a scaled-down version of the model are used to simulate successfully the probability of excessive backlogs for many M/M/1 queues in tandem and for any combination of parameters. Alternatively, we can build on the LD heuristics and, using the notion of convex combinations of harmonics, successfully treat the optimal buffer allocation case.
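To make the importance-sampling idea concrete, the following is a minimal sketch (not the paper's method) of the LD-style change of measure attributed to Walrand and Parekh for a single M/M/1 queue: swap the arrival and service rates so that overflow becomes likely, and reweight each overflow path by its likelihood ratio. The function names and parameters are illustrative, and the exact gambler's-ruin formula is included only as a sanity check.

```python
import random

def is_overflow_prob(lam, mu, B, n_samples, seed=0):
    """Importance-sampling estimate of the probability that an M/M/1
    queue (arrival rate lam < service rate mu) reaches buffer level B
    before emptying, starting from 1 customer.

    Change of measure: swap lam and mu, so the embedded random walk
    drifts upward and hits B frequently."""
    rng = random.Random(seed)
    rho = lam / mu
    p_tilt = mu / (lam + mu)  # up-step probability under the tilted measure
    total = 0.0
    for _ in range(n_samples):
        x = 1
        while 0 < x < B:
            x += 1 if rng.random() < p_tilt else -1
        if x == B:
            # Every path from 1 to B has (ups - downs) = B - 1, so the
            # likelihood ratio rho**ups * (1/rho)**downs collapses to a
            # constant rho**(B-1) on each overflow path.
            total += rho ** (B - 1)
    return total / n_samples

def exact_overflow_prob(lam, mu, B):
    """Gambler's-ruin formula: hit B before 0, starting from state 1."""
    r = mu / lam
    return (1 - r) / (1 - r ** B)
```

For lam=1, mu=2, B=15 the target probability is about 3e-5; naive simulation would need millions of runs to see the event at all, while the tilted estimator resolves it with a few thousand. This single-queue tilting is exactly the heuristic the abstract says can fail for some parameter combinations in networks, motivating the paper's conditional-law construction.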