The need to express uncertainty in stochastic simulation systems is widely recognized. However, the emphasis has been on assessing uncertainty in simulation model input parameters, with output uncertainty deduced from the input uncertainty. Commonly used methods for assessing uncertainty include delta-method approaches, resampling methods, and Bayesian analysis. The problem with all of these methods is that the typical simulation user is not particularly proficient in statistics, and so is unlikely to be aware of appropriate sensitivity and/or uncertainty analyses. This suggests the need for a transparent, implementable, and efficient method for understanding uncertainty, especially simulation output uncertainty. In this paper, we propose a simple and straightforward framework for assessing stochastic simulation output uncertainty based on Bayesian Melding. We first assume a parametric form for the probability distribution function of the simulation output. We further assume that the overall output uncertainty is the weighted sum of the uncertainty of each simulation run, with each run's weight proportional to its probability. The advantage of these assumptions is that, to describe the simulation output uncertainty as a probability distribution function after a limited number of simulation runs, only two tasks remain: (1) estimating the parameters of the simulation output probability distribution function, and (2) calculating the weight for each simulation run. Both are discussed in detail in this paper.
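The two tasks above can be illustrated with a minimal sketch. It assumes, purely for illustration, that each run's output is normally distributed (so task (1) reduces to fitting a mean and variance per run) and that weights are proportional to the likelihood of a hypothetical reference observation `y_obs` under each fitted distribution; the synthetic run data, the normality assumption, and this particular weighting rule are stand-ins, not the paper's exact Bayesian Melding procedure.

```python
import math
import random

random.seed(0)

# Hypothetical data: three simulation runs, each producing 50 output samples.
runs = [[random.gauss(mu, 1.0) for _ in range(50)] for mu in (4.8, 5.0, 5.3)]

def fit_normal(samples):
    """Task (1): estimate the parameters of the assumed output distribution."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return mean, var

params = [fit_normal(r) for r in runs]

def normal_pdf(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Task (2): weight each run in proportion to its probability; here we use
# the likelihood of a hypothetical reference observation y_obs.
y_obs = 5.0
likelihoods = [normal_pdf(y_obs, m, v) for m, v in params]
total = sum(likelihoods)
weights = [lk / total for lk in likelihoods]

# The overall output uncertainty is then the weighted (mixture) combination
# of the per-run distributions; its mean and variance follow directly.
mixture_mean = sum(w * m for w, (m, _) in zip(weights, params))
mixture_var = (sum(w * (v + m ** 2) for w, (m, v) in zip(weights, params))
               - mixture_mean ** 2)
print(mixture_mean, mixture_var)
```

The mixture variance combines each run's own variance with the spread between run means, so runs whose output is more probable dominate the final uncertainty estimate.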