In this paper, a novel and effective criterion based on the estimation of the signal-to-noise-ratio figure (SNRF) is proposed to optimize the number of hidden neurons in neural networks and thereby avoid overfitting in function approximation. The SNRF quantitatively measures the useful information left unlearned, so that overfitting can be detected automatically from the training error alone, without the use of a separate validation set. The approach is illustrated by optimizing the number of hidden neurons in a multilayer perceptron (MLP) on benchmark datasets. The criterion can further be applied to the optimization of other neural network parameters whenever overfitting needs to be considered.
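The abstract only outlines the procedure, so the sketch below is a rough illustration of the idea rather than the paper's method: it grows the hidden layer of an MLP and stops once a signal-to-noise-style measure of the training residuals suggests that mostly noise is left to learn. The residual_snr helper is a hypothetical stand-in for the paper's SNRF estimator, and the 0.1 threshold, the sinc benchmark data, and the scikit-learn MLPRegressor setup are all assumptions made for the demonstration.

```python
# Hedged sketch (not the paper's exact SNRF formula): grow the hidden layer of an
# MLP and stop when a signal-to-noise-style measure of the *training* residuals
# indicates that only noise remains, so no separate validation set is needed.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, 200)).reshape(-1, 1)        # assumed 1-D benchmark input
y = np.sinc(X.ravel()) + rng.normal(0, 0.05, X.shape[0])   # noisy target function

def residual_snr(residual):
    """Illustrative stand-in for SNRF: lag-1 autocorrelation of the residuals
    as the 'signal' component versus their variance as the 'noise' component.
    The paper defines its own SNRF estimator; this placeholder only mimics
    the intent of measuring useful information left unlearned."""
    r = residual - residual.mean()
    signal = np.abs(np.dot(r[:-1], r[1:])) / (len(r) - 1)
    noise = np.dot(r, r) / len(r)
    return signal / noise if noise > 0 else 0.0

threshold = 0.1  # assumed cutoff; the paper derives a threshold for pure noise
for n_hidden in range(1, 31):
    mlp = MLPRegressor(hidden_layer_sizes=(n_hidden,), max_iter=5000,
                       random_state=0).fit(X, y)
    snrf = residual_snr(y - mlp.predict(X))
    print(f"hidden={n_hidden:2d}  SNRF-like measure={snrf:.3f}")
    if snrf < threshold:  # residuals look like white noise: stop growing the net
        print(f"Stop: {n_hidden} hidden neurons appear sufficient.")
        break
```

In this sketch the training error itself drives the stopping decision, mirroring the abstract's claim that overfitting can be detected without a validation set; only the specific residual statistic and threshold are assumptions.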