Multiple comparison procedures
An Experimental and Theoretical Comparison of Model Selection Methods
Machine Learning - Special issue on the eighth annual conference on computational learning theory (COLT '95)
Bayesian nonlinear model selection and neural networks: a conjugate prior approach
IEEE Transactions on Neural Networks
Neural network architecture selection: can function complexity help?
Neural Processing Letters
IWANN'05 Proceedings of the 8th International Conference on Artificial Neural Networks: Computational Intelligence and Bioinspired Systems
One of the main research concerns in neural networks is finding the appropriate network size, so as to minimize the trade-off between overfitting and poor approximation. This paper addresses the choice among different competing models fitted to the same data set by applying statistical methods for model comparison. The study aims to find a range of models that perform equally well as the cost of complexity varies. If the models do not differ, their generalization error estimates should be about the same; if they do, the estimates should differ, and the task then consists in analyzing the pairwise differences between the smallest generalization error estimate and each of the others, in order to bound the set of models that may yield equal performance. The method is illustrated on polynomial regression and RBF neural networks.
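The procedure described above can be sketched in code. The paper's exact statistical test is not specified here, so this illustration stands in paired t-tests with a Bonferroni correction as one common multiple-comparison choice; the data, the candidate set (polynomial degrees 1-8), and the cross-validation setup are all illustrative assumptions, not the authors' experiment.

```python
import numpy as np
from scipy import stats

def cv_fold_errors(x, y, degree, k=5, seed=0):
    """Per-fold mean squared error of a degree-`degree` polynomial fit.

    The fixed seed keeps the fold assignment identical across degrees,
    so fold-wise errors can be compared in a paired fashion.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))
    folds = np.array_split(idx, k)
    errs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        coef = np.polyfit(x[train], y[train], degree)
        pred = np.polyval(coef, x[test])
        errs.append(np.mean((y[test] - pred) ** 2))
    return np.array(errs)

# Synthetic data (purely illustrative).
rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 120)
y = np.sin(2.5 * x) + rng.normal(0, 0.2, x.size)

# Generalization error estimates for each candidate model.
degrees = range(1, 9)
errors = {d: cv_fold_errors(x, y, d) for d in degrees}
best = min(degrees, key=lambda d: errors[d].mean())

# Pairwise comparison of each model against the least-error one,
# with a Bonferroni-corrected significance level.
alpha = 0.05 / (len(degrees) - 1)
equivalent = [best]
for d in degrees:
    if d == best:
        continue
    _, p = stats.ttest_rel(errors[d], errors[best])
    if p > alpha:  # no significant difference from the best model
        equivalent.append(d)

print("degree with smallest error estimate:", best)
print("degrees statistically indistinguishable from it:", sorted(equivalent))
```

The set `equivalent` then bounds the range of models that might perform equally well, from which the least complex one would typically be preferred.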