Ordinal optimization: a nonparametric framework

  • Authors:
  • Peter W. Glynn; Sandeep Juneja

  • Affiliations:
  • Stanford University, Stanford, CA; School of Technology and Computer Science, Tata Institute of Fundamental Research, Mumbai, India

  • Venue:
  • Proceedings of the Winter Simulation Conference
  • Year:
  • 2011


Abstract

Simulation-based ordinal optimization has frequently relied on large deviations analysis as a theoretical device for arguing that it is computationally easier to identify the best system out of d alternatives than to estimate the actual performance of a given design. In this paper, we argue that practical implementations of these large deviations-based methods need to estimate the underlying large deviations rate functions of the competing designs from the samples generated. Because such rate functions are difficult to estimate accurately (due to the heavy tails that naturally arise in this setting), the probability of mis-estimation will generally dominate the underlying large deviations probability, making it difficult to build reliable algorithms that are supported theoretically through large deviations analysis. However, when we justify ordinal optimization algorithms on the basis of guaranteed finite-sample bounds (as can be done when the associated random variables are bounded), we show that satisfactory and practically implementable algorithms can be designed.
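
To make the finite-sample idea concrete, the sketch below shows one standard way such a guarantee can be obtained for bounded outputs: a Hoeffding inequality combined with a union bound gives a per-system sample size under which the empirical best coincides with the true best with high probability. This is an illustrative construction, not the paper's specific algorithm; the function names, the assumed minimum gap between the best and the other means, and the example simulators are hypothetical.

import math
import random

def samples_needed(d, gap, delta, low=0.0, high=1.0):
    """Per-system sample size from Hoeffding's inequality plus a union bound,
    so that the design with the largest sample mean is the true best with
    probability >= 1 - delta, assuming outputs lie in [low, high] and the best
    mean exceeds every other mean by at least `gap` (both assumptions are
    inputs, not estimated from data)."""
    span = high - low
    return math.ceil(2.0 * span ** 2 / gap ** 2 * math.log((d - 1) / delta))

def select_best(simulators, n):
    """Run each bounded simulator n times and return the index of the design
    with the largest sample mean (ordinal selection)."""
    means = [sum(sim() for _ in range(n)) / n for sim in simulators]
    return max(range(len(simulators)), key=lambda i: means[i])

if __name__ == "__main__":
    # Three hypothetical designs with outputs in [0, 1]; design 0 is best.
    sims = [lambda: random.betavariate(6, 4),   # mean 0.6
            lambda: random.betavariate(5, 5),   # mean 0.5
            lambda: random.betavariate(4, 6)]   # mean 0.4
    n = samples_needed(d=3, gap=0.1, delta=0.05)
    print("samples per design:", n)
    print("selected design:", select_best(sims, n))

Note that this bound uses only the range of the outputs and a postulated gap; nothing about the sampling distributions (in particular, no rate function) has to be estimated, which is what makes guarantees of this form practically implementable.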