Annals of Operations Research - Special issue on sensitivity analysis and optimization of discrete event systems
The Grid
A branch and bound method for stochastic global optimization. Mathematical Programming, Series A and B.
Simulation Modeling and Analysis
Decomposition Algorithms for Stochastic Programming on a Computational Grid. Computational Optimization and Applications.
Variance Reduction and Objective Function Evaluation in Stochastic Linear Programs. INFORMS Journal on Computing.
Adaptive Ordering and Pricing for Perishable Products. Operations Research.
Jackknife estimators for reducing bias in asset allocation. Proceedings of the 38th Winter Simulation Conference.
Reformulation and sampling to solve a stochastic network interdiction problem. Networks (special issue on games, interdiction, and human interaction problems on networks).
Monte Carlo bounding techniques for determining solution quality in stochastic programs. Operations Research Letters.
Proceedings of the Winter Simulation Conference
Stochastic linear programs can be solved approximately by drawing a subset of all possible random scenarios and solving the problem based on this subset, an approach known as sample average approximation (SAA). The value of the objective function at the optimal solution obtained via SAA provides an estimate of the true optimal objective function value. This estimator is known to be optimistically biased; the expected optimal objective function value for the sampled problem is lower (for minimization problems) than the optimal objective function value for the true problem. We investigate how two alternative sampling methods, antithetic variates (AV) and Latin Hypercube (LH) sampling, affect both the bias and variance, and thus the mean squared error (MSE), of this estimator. For a simple example, we analytically express the reductions in bias and variance obtained by these two alternative sampling methods. For eight test problems from the literature, we computationally investigate the impact of these sampling methods on bias and variance. We find that both sampling methods are effective at reducing mean squared error, with Latin Hypercube sampling outperforming antithetic variates. For our analytic example and the eight test problems we derive or estimate the condition number as defined in Shapiro et al. (Math. Program. 94:1–19, 2002). We find that for ill-conditioned problems, bias plays a larger role in MSE, and AV and LH sampling methods are more likely to reduce bias.
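The downward bias of the SAA optimal-value estimator, and its reduction under antithetic and Latin Hypercube sampling, can be illustrated with a toy problem not taken from the paper: minimize E[(x − ξ)²] with ξ ~ N(0, 1). The true optimal value is Var(ξ) = 1, while the SAA optimum is the sample mean and the SAA optimal value is the uncorrected sample variance, whose expectation is (n−1)/n, i.e. biased low by 1/n. The sketch below (all names and parameters are illustrative choices, not from the paper) estimates this bias under i.i.d., AV, and LH sampling; for this particular quadratic problem, antithetic pairing makes the sample mean exactly zero, which removes the bias entirely.

```python
import numpy as np
from statistics import NormalDist

inv_cdf = NormalDist().inv_cdf          # standard normal inverse CDF (stdlib)
rng = np.random.default_rng(0)

def saa_value(xi):
    # SAA optimum of min_x (1/n) sum_i (x - xi_i)^2 is x* = mean(xi);
    # the SAA optimal value is then the uncorrected sample variance.
    x_star = xi.mean()
    return float(((xi - x_star) ** 2).mean())

def sample_iid(n):
    return rng.standard_normal(n)

def sample_av(n):
    # Antithetic variates: pairs (z, -z); n is assumed even.
    z = rng.standard_normal(n // 2)
    return np.concatenate([z, -z])

def sample_lh(n):
    # Latin Hypercube: one uniform draw per stratum of (0,1), strata
    # randomly permuted, mapped through the normal inverse CDF.
    u = (rng.permutation(n) + rng.random(n)) / n
    return np.array([inv_cdf(p) for p in u])

def est_bias(sampler, n=10, reps=5000):
    # Monte Carlo estimate of E[SAA optimal value] - true optimal value (= 1).
    return float(np.mean([saa_value(sampler(n)) for _ in range(reps)]) - 1.0)

for name, sampler in [("iid", sample_iid),
                      ("antithetic", sample_av),
                      ("Latin Hypercube", sample_lh)]:
    print(f"{name:>15}: estimated bias {est_bias(sampler):+.3f}")
```

With n = 10, the i.i.d. estimate should come out near the analytic bias of −1/n = −0.1, while the AV and LH estimates should be near zero, mirroring the bias reductions the abstract reports for its analytic example.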