Proceedings of the Winter Simulation Conference
Simulation is one of the most powerful tools for modeling and evaluating the performance of complex systems; however, it is computationally slow. One approach to overcoming this limitation is to develop a "metamodel": a "surrogate" model of the original system that accurately captures the relationship between inputs and outputs, yet is computationally more efficient than simulation. Neural networks (NNs) are known to be good function approximators and thus make good metamodel candidates. During training, a NN is presented with several input/output pairs and is expected to learn the functional relationship between the inputs and outputs of the simulation model; a trained net can then predict the output for inputs other than those presented during training. This ability of NNs to generalize depends on the number of training pairs used. In general, a large number of such pairs is required and, since each pair is obtained through simulation, metamodel development is slow. In discrete event simulation (DES), it is often possible to use perturbation analysis to also obtain sensitivity information with respect to various input parameters. In this paper, we investigate the use of this sensitivity information to reduce the simulation effort required to train a NN metamodel.
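The core idea can be illustrated with a minimal sketch (not the paper's algorithm): fit a quadratic surrogate to a toy "simulation" f(x) = x^2 from only two design points. Two output values alone under-determine the three coefficients, but adding the derivative f'(x) = 2x at the same points (the kind of sensitivity information perturbation analysis can supply) yields four equations, enough to recover the model without extra simulation runs. All function names here are hypothetical.

```python
def solve(A, b):
    """Solve the square system A w = b by Gaussian elimination
    with partial pivoting (A is a list of rows)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (M[r][n] - sum(M[r][c] * w[c] for c in range(r + 1, n))) / M[r][r]
    return w

def fit_with_sensitivities(xs, ys, dys):
    """Least-squares fit of w2*x^2 + w1*x + w0 using both value
    observations (ys) and derivative observations (dys)."""
    rows, rhs = [], []
    for x, y in zip(xs, ys):
        rows.append([x * x, x, 1.0])      # value equation: w2*x^2 + w1*x + w0 = y
        rhs.append(y)
    for x, dy in zip(xs, dys):
        rows.append([2.0 * x, 1.0, 0.0])  # derivative equation: 2*w2*x + w1 = dy
        rhs.append(dy)
    # normal equations: (A^T A) w = A^T b
    n = 3
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    Atb = [sum(r[i] * v for r, v in zip(rows, rhs)) for i in range(n)]
    return solve(AtA, Atb)

# "Simulate" f(x) = x^2 at just two design points.
xs = [0.5, 1.5]
ys = [x * x for x in xs]       # simulation outputs
dys = [2.0 * x for x in xs]    # sensitivities, e.g. from perturbation analysis

w2, w1, w0 = fit_with_sensitivities(xs, ys, dys)
print(w2, w1, w0)  # recovers (approximately) 1, 0, 0
```

The same principle carries over to NN training: each derivative observation adds constraints on the fitted function at no extra simulation cost, so fewer simulation runs are needed for a given level of generalization.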