We present a novel approach to dealing with overfitting in black-box models. It is based on the leverages of the samples, that is, on the influence that each observation has on the parameters of the model. Since overfitting is a consequence of the model specializing on specific data points during training, we propose a selection method for nonlinear models based on the estimation of leverages and confidence intervals. It allows both the selection among models of equivalent complexity corresponding to different minima of the cost function (e.g., neural networks with the same number of hidden units) and the selection among models of different complexities (e.g., neural networks with different numbers of hidden units). A complete model selection methodology is derived.
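As an illustration only (not necessarily the authors' exact procedure), the sketch below shows the standard way leverages are estimated for a nonlinear model linearized around its trained parameters, together with the classical "virtual" leave-one-out score that such leverages make possible. It assumes the N x p Jacobian Z of the model output with respect to the p parameters, evaluated at the N training samples, is available; the function names and the rank-tolerance choice are illustrative.

```python
import numpy as np

def leverages(jacobian):
    """Leverage h_ii of each training sample for a locally linearized model.

    `jacobian` is the N x p matrix Z of partial derivatives of the model
    output with respect to its p parameters, one row per training sample.
    The leverage of sample i is the i-th diagonal element of the projector
    Z (Z^T Z)^{-1} Z^T, computed here via an SVD for numerical stability.
    Each h_ii lies in [0, 1] and the leverages sum to the rank of Z.
    """
    u, s, _ = np.linalg.svd(jacobian, full_matrices=False)
    tol = s[0] * max(jacobian.shape) * np.finfo(s.dtype).eps
    u = u[:, s > tol]                      # keep directions above the tolerance
    return np.sum(u * u, axis=1)           # diagonal of U U^T

def virtual_leave_one_out_score(residuals, h):
    """Approximate leave-one-out RMSE from training residuals and leverages.

    Uses the classical approximation e_i^(-i) ~= e_i / (1 - h_ii), which
    avoids retraining the model N times.
    """
    return np.sqrt(np.mean((residuals / (1.0 - h)) ** 2))
```

Under these assumptions, candidate models of equal complexity (different minima of the cost function) could be compared by their virtual leave-one-out score, while samples with leverages close to 1 flag local overfitting; this is a generic use of leverages, and the paper's own selection criteria should be taken from the full text.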