Adaptive Newton-based multivariate smoothed functional algorithms for simulation optimization

  • Author: Shalabh Bhatnagar
  • Affiliation: Indian Institute of Science, Bangalore, India
  • Venue: ACM Transactions on Modeling and Computer Simulation (TOMACS)
  • Year: 2007

Abstract

In this article, we present three smoothed functional (SF) algorithms for simulation optimization. The first estimates only the gradient, using a finite difference approximation with two parallel simulations; the other two are adaptive Newton-based stochastic approximation algorithms that estimate both the gradient and the Hessian. One of the Newton-based algorithms requires only one simulation and uses one-sided estimates of both the gradient and the Hessian, while the other uses two-sided estimates of both quantities and requires two simulations. To obtain the gradient and Hessian estimates, we perturb each parameter component randomly using independent and identically distributed (i.i.d.) Gaussian random variates. Earlier SF algorithms in the literature estimate only the gradient of the objective function. Using similar techniques, we derive two unbiased SF-based estimators for the Hessian and develop suitable three-timescale stochastic approximation procedures for simulation optimization. We present a detailed convergence analysis of our algorithms and report numerical experiments, with a parameter of dimension 50, on a setting involving a network of M/G/1 queues with feedback. We compare the performance of our algorithms with that of related algorithms in the literature. While our two-simulation Newton-based algorithm shows the best results overall, our one-simulation algorithm outperforms the other one-simulation algorithms.
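To make the Gaussian-perturbation idea concrete, below is a minimal Monte Carlo sketch of one-sided SF gradient and Hessian estimation. It assumes a noiseless objective J in place of the paper's simulation measurements and a simple batch average in place of the paper's three-timescale stochastic approximation updates; the function name sf_gradient_hessian and its parameters are illustrative, not the paper's notation.

```python
import numpy as np

def sf_gradient_hessian(J, theta, beta=0.1, num_samples=10_000, rng=None):
    """One-sided Gaussian smoothed functional (SF) estimates of the
    gradient and Hessian of J at theta. Each sample uses a single
    evaluation of J at a randomly perturbed point, mirroring the
    one-simulation estimator described in the abstract."""
    rng = rng if rng is not None else np.random.default_rng(0)
    d = theta.size
    grad = np.zeros(d)
    hess = np.zeros((d, d))
    for _ in range(num_samples):
        eta = rng.standard_normal(d)   # i.i.d. N(0, 1) perturbation of each component
        y = J(theta + beta * eta)      # one "simulation" per sample
        grad += eta * y                # E[eta * J(theta + beta*eta)] / beta -> gradient
        hess += (np.outer(eta, eta) - np.eye(d)) * y
    return grad / (num_samples * beta), hess / (num_samples * beta ** 2)

# Sanity check on a known quadratic: grad J(x) = x, Hess J(x) = I.
J = lambda x: 0.5 * float(x @ x)
g, H = sf_gradient_hessian(J, np.ones(4), beta=0.05, num_samples=200_000)
```

The weighting factors eta/beta and (eta eta^T - I)/beta^2 arise from differentiating the Gaussian-smoothed objective E[J(theta + beta*eta)] once and twice via Stein's identity. The two-simulation variant in the abstract instead uses symmetric measurements at theta +/- beta*eta, at the cost of a second simulation per sample.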