We discuss a number of methods for estimating the standard error of predicted values from a multilayer perceptron. These methods include the delta method based on the Hessian, bootstrap estimators, and the “sandwich” estimator. The methods are described and compared on several examples. We find that the bootstrap methods perform best, partly because they capture variability due to the choice of starting weights.
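The pairs-bootstrap idea can be sketched as follows: resample (x, y) pairs, refit the network from a fresh random weight initialization each time (so the estimate also reflects variability due to the starting weights), and take the standard deviation of the resulting predictions. This is a minimal illustrative sketch, not the paper's implementation; the tiny NumPy one-hidden-layer network and the helper names `fit_mlp` and `bootstrap_se` are assumptions made for the example.

```python
import numpy as np

def fit_mlp(X, y, hidden=5, lr=0.05, epochs=500, seed=0):
    # One-hidden-layer MLP (tanh hidden units, linear output),
    # trained by full-batch gradient descent on squared error.
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden))
    b1 = np.zeros(hidden)
    w2 = rng.normal(0.0, 0.5, hidden)
    b2 = 0.0
    n = len(y)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)          # hidden activations, shape (n, hidden)
        err = H @ w2 + b2 - y             # residuals, shape (n,)
        gH = np.outer(err, w2) * (1 - H ** 2)  # backprop through tanh
        W1 -= lr * (X.T @ gH) / n
        b1 -= lr * gH.mean(axis=0)
        w2 -= lr * (H.T @ err) / n
        b2 -= lr * err.mean()
    return lambda Xn: np.tanh(Xn @ W1 + b1) @ w2 + b2

def bootstrap_se(X, y, X_new, B=20, seed=0):
    # Pairs bootstrap: resample (x, y) cases with replacement and refit
    # with a fresh weight initialization on each replicate, so the
    # standard error also captures starting-weight variability.
    rng = np.random.default_rng(seed)
    preds = np.empty((B, len(X_new)))
    for b in range(B):
        idx = rng.integers(0, len(y), len(y))
        net = fit_mlp(X[idx], y[idx], seed=int(rng.integers(1 << 31)))
        preds[b] = net(X_new)
    return preds.mean(axis=0), preds.std(axis=0, ddof=1)

# Toy regression problem: noisy sine curve.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (80, 1))
y = np.sin(2 * X[:, 0]) + rng.normal(0, 0.1, 80)
X_new = np.array([[0.0], [0.5]])
mean_pred, se = bootstrap_se(X, y, X_new)
```

The per-point `se` values are the bootstrap standard errors of the fitted curve at `X_new`; a rough normal-theory prediction band would be `mean_pred ± 2 * se`.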