In the paper, a three-layer perceptron with one hidden layer and an output layer consisting of a single neuron is considered. This architecture is commonly used for regression problems, where one seeks the perceptron that minimizes the mean squared error criterion over the data points (x_k, y_k), k = 1, ..., N. It is shown that in the model y_k = g_0(x_k) + ε_k, k = 1, ..., N, where x_k is independent of the zero-mean error term ε_k, this procedure is consistent as N → ∞, provided that g_0 is representable as a three-layer perceptron with the Heaviside transfer function. The same result holds when the transfer function is an arbitrary continuous function with bounded limits at ±∞ and the hidden-to-output weights in the considered family of perceptrons are bounded.
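For concreteness, the setting can be sketched numerically: draw (x_k, y_k) from y_k = g_0(x_k) + ε_k with x_k independent of the zero-mean noise ε_k, and fit a one-hidden-layer network with a single output neuron by least squares. The sketch below is only an illustration of that setting, not the paper's construction; the choice of g_0, the noise level, the sample size, and the use of scikit-learn's MLPRegressor with a logistic activation (a continuous transfer function with bounded limits at ±∞, matching the second result) are all assumptions made here.

```python
# Illustrative sketch of the regression model from the abstract:
# y_k = g0(x_k) + eps_k, fitted by a three-layer perceptron
# (one hidden layer, single output neuron) minimizing mean squared error.
# g0, the noise level, and N are hypothetical choices for this demo.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def g0(x):
    # A bounded "true" regression function, standing in for a target
    # representable by a network with bounded-limit hidden units.
    return np.tanh(3.0 * x) - 0.5 * np.tanh(x - 1.0)

N = 2000                                   # consistency is an N -> infinity statement
x = rng.uniform(-3.0, 3.0, size=(N, 1))   # inputs, drawn independently of the noise
eps = rng.normal(0.0, 0.1, size=N)        # zero-mean error term
y = g0(x).ravel() + eps

# Three-layer perceptron: input -> one hidden layer -> one output neuron.
# 'logistic' is continuous with bounded limits at +/- infinity;
# MLPRegressor trains by minimizing squared error.
net = MLPRegressor(hidden_layer_sizes=(10,), activation="logistic",
                   solver="lbfgs", max_iter=5000, random_state=0)
net.fit(x, y)

# Empirical check: the fitted network should track g0 on a test grid.
xt = np.linspace(-3.0, 3.0, 200).reshape(-1, 1)
print("max |net - g0| on grid:", np.max(np.abs(net.predict(xt) - g0(xt).ravel())))
```

With the noise averaging out as N grows, the least-squares fit should approach g_0, which is the consistency phenomenon the paper establishes for the Heaviside and bounded continuous transfer-function cases.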