Sample Complexity for Function Learning Tasks through Linear Neural Networks
MICAI '02 Proceedings of the Second Mexican International Conference on Artificial Intelligence: Advances in Artificial Intelligence
In this paper, we bound the generalization error of a class of Radial Basis Function networks, for certain well-defined function learning tasks, in terms of the number of parameters and the number of examples. We show that the total generalization error is partly due to the insufficient representational capacity of the network (because of its finite size) and partly due to insufficient information about the target function (because of the finite number of examples). We make several observations about generalization error that are valid irrespective of the approximation scheme. Our result also sheds light on how to choose an appropriate network architecture for a particular problem.
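The two-part decomposition described above can be illustrated empirically. The sketch below is not the paper's construction: it is a minimal NumPy experiment, assuming an illustrative sine target, Gaussian basis functions with grid-placed centers, and output weights fit by linear least squares. A network with too few basis functions suffers mainly approximation error (finite size), while a large network trained on few noisy examples suffers mainly estimation error (finite data); giving it both enough parameters and enough examples should yield the smallest error.

```python
import numpy as np

def rbf_design(x, centers, width):
    # Gaussian radial basis design matrix: Phi[i, j] = exp(-(x_i - c_j)^2 / (2 width^2))
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width**2))

def fit_rbf(x_train, y_train, n_centers, width=0.2):
    # Fix centers on a uniform grid over [0, 1]; solve output weights by least squares.
    centers = np.linspace(0.0, 1.0, n_centers)
    Phi = rbf_design(x_train, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)
    return centers, width, w

def generalization_error(n_centers, n_samples, noise=0.1, trials=20):
    # Mean squared error against the clean target on a dense test grid,
    # averaged over independent noisy training sets (an empirical proxy
    # for the generalization error the paper bounds analytically).
    rng = np.random.default_rng(0)
    target = lambda x: np.sin(2.0 * np.pi * x)  # illustrative target, not from the paper
    x_test = np.linspace(0.0, 1.0, 1000)
    y_test = target(x_test)
    errs = []
    for _ in range(trials):
        x_tr = rng.uniform(0.0, 1.0, n_samples)
        y_tr = target(x_tr) + noise * rng.standard_normal(n_samples)
        c, wd, w = fit_rbf(x_tr, y_tr, n_centers)
        y_hat = rbf_design(x_test, c, wd) @ w
        errs.append(np.mean((y_hat - y_test) ** 2))
    return float(np.mean(errs))

# Approximation error dominates: too few basis functions, even with many examples.
err_small_net = generalization_error(n_centers=3, n_samples=500)
# Estimation error dominates: enough basis functions, but few noisy examples.
err_few_samples = generalization_error(n_centers=12, n_samples=25)
# Both resources ample: the error should be the smallest of the three.
err_ample = generalization_error(n_centers=12, n_samples=500)
```

Comparing `err_ample` against the other two runs shows both error sources at work: shrinking either the network or the training set raises the measured generalization error, mirroring the trade-off the abstract attributes to finite size and finite data.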