Resampling methods such as cross-validation and the bootstrap are a necessity in neural network design for solving the problem of model structure selection. The bootstrap is a powerful method, offering a low-variance estimate of the model generalization error. Unfortunately, its computational load may be excessive when it is used to select among neural network models of different structures or complexities. This paper presents the fast bootstrap (FB) methodology for selecting the best model structure; the methodology is applied here to regression tasks. The fast bootstrap relies on the observation that the computationally expensive term estimated by the bootstrap, the optimism, is usually a smooth function (a low-order polynomial) of the complexity parameter. Approximating the optimism term makes it possible to reduce the necessary number of bootstrap simulations considerably. The FB methodology is illustrated on multilayer perceptrons, radial-basis function networks and least-squares support vector machines.
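The idea can be sketched as follows: the bootstrap's optimism term (the average gap between a resampled model's error on the original sample and on its own resample) is estimated at only a few complexity values, and a low-order polynomial fitted to those estimates is then evaluated cheaply at every candidate complexity. This is a minimal illustration assuming squared-error loss; the names `bootstrap_optimism` and `fb_select`, and the generic `fit`/`predict` callables, are hypothetical and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_optimism(X, y, fit, predict, n_boot=20):
    """Efron-style optimism estimate: mean over bootstrap resamples of
    (error on the original sample) - (error on the resample itself).
    This is the computationally expensive term the FB method approximates."""
    n = len(y)
    opt = 0.0
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)          # bootstrap resample (with replacement)
        model = fit(X[idx], y[idx])
        e_orig = np.mean((predict(model, X) - y) ** 2)
        e_boot = np.mean((predict(model, X[idx]) - y[idx]) ** 2)
        opt += e_orig - e_boot
    return opt / n_boot

def fb_select(complexities, sampled_c, sampled_optimism, apparent_error, degree=2):
    """Fast-bootstrap selection sketch: fit a low-order polynomial to the
    optimism values measured at a few complexities, extrapolate it to all
    candidates, and pick the complexity minimizing apparent error + optimism."""
    coeffs = np.polyfit(sampled_c, sampled_optimism, degree)
    optimism_hat = np.polyval(coeffs, complexities)
    generalization_hat = apparent_error + optimism_hat
    return complexities[np.argmin(generalization_hat)]

# Synthetic demo: apparent error falls with complexity, optimism grows with it.
complexities = np.arange(1, 11)
apparent_error = 1.0 / complexities
sampled_c = np.array([1.0, 4.0, 7.0, 10.0])   # only 4 costly bootstrap runs
sampled_optimism = 0.01 * sampled_c ** 2       # stand-in for measured optimism
best = fb_select(complexities, sampled_c, sampled_optimism, apparent_error)

# Optimism of a trivial constant model, just to exercise the estimator.
X = rng.normal(size=(40, 1))
y = X[:, 0] + rng.normal(scale=0.5, size=40)
opt = bootstrap_optimism(X, y, lambda X, y: y.mean(),
                         lambda m, X: np.full(len(X), m))
```

The saving comes from replacing a full bootstrap at every candidate structure with a handful of bootstrap runs plus a cheap polynomial fit; the apparent (training) error is still computed for each candidate, but that requires only one model fit per structure.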