Generalization Error and Training Error at Singularities of Multilayer Perceptrons
IWANN '01 Proceedings of the 6th International Work-Conference on Artificial and Natural Neural Networks: Connectionist Models of Neurons, Learning Processes and Artificial Intelligence-Part I
Algebraic Analysis for Nonidentifiable Learning Machines
Neural Computation
In wide-ranging applications of layered neural networks and radial basis function networks, we often encounter the problem of determining the optimal network size based only on available data of finite size. One technique for solving this problem is statistical model selection, in which the optimal size is determined according to a model selection criterion. In the neural network field, several model selection criteria have been proposed, such as GPE (Generalized Prediction Error) [9] and NIC (Network Information Criterion) [10]. These generalize the classical FPE (Final Prediction Error) [1] and the most popular criterion, AIC (Akaike Information Criterion) [2]. For example, NIC reduces to AIC when the true target function lies in the family of networks [10], i.e., in the realizable or overrealizable scenario. A target is said to be overrealizable if it is realizable by the family of networks but the family is not minimal; in other words, a network representing the true function is redundant.
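As a minimal illustration of criterion-based model selection of the kind described above, the sketch below fits models of increasing size and picks the one minimizing AIC. It is not the procedure from this paper: polynomial regression stands in for a family of networks, and the Gaussian-residual form of AIC, n·log(RSS/n) + 2k, is an assumption chosen for simplicity.

```python
import numpy as np

def aic(n, rss, k):
    """AIC under a Gaussian noise assumption: n*log(RSS/n) + 2*k,
    where k is the number of free parameters."""
    return n * np.log(rss / n) + 2 * k

# Synthetic data: a smooth target plus noise (hypothetical example).
rng = np.random.default_rng(0)
n = 100
x = np.linspace(-1.0, 1.0, n)
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(n)

# Candidate "model sizes": polynomial degrees standing in for network sizes.
scores = {}
for degree in range(1, 8):
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
    k = degree + 1  # number of fitted coefficients
    scores[degree] = aic(n, rss, k)

# Select the size with the smallest criterion value.
best = min(scores, key=scores.get)
```

The penalty term 2k is what keeps the criterion from always preferring the largest model; an overrealizable (redundant) model lowers RSS only slightly while paying the full parameter penalty.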