Radial basis functions for multivariable interpolation: a review
Algorithms for approximation
Regularized neural networks: some convergence rate results
Neural Computation
Neural Networks: A Comprehensive Foundation
A Theory of Networks for Approximation and Learning
Learning with generalization capability by kernel methods of bounded complexity
Journal of Complexity
Learning methods for radial basis function networks
Future Generation Computer Systems
Hybrid learning of regularization neural networks
ICAISC'10 Proceedings of the 10th International Conference on Artificial Intelligence and Soft Computing, Part II
Memetic evolutionary learning for local unit networks
ISNN'10 Proceedings of the 7th International Conference on Advances in Neural Networks, Volume Part I
Regularization theory provides a sound framework for solving supervised learning problems. However, there is a gap between the theoretical results and the practical applicability of regularization networks (RN). Radial basis function (RBF) networks can be seen as a special case of regularization networks with a choice of learning algorithms. We study the relationship between RN and RBF networks, and experimentally evaluate their approximation and generalization abilities with respect to the number of hidden units.
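To illustrate the connection the abstract describes, here is a minimal sketch (not the authors' implementation) of a regularization network: one Gaussian unit is placed at each training point, and the output weights come from a regularized linear system. An RBF network would instead use fewer hidden units with centers chosen by a learning algorithm. The kernel width and regularization parameter below are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, C, width):
    # Pairwise Gaussian responses between inputs X and centers C
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rn(X, y, width=0.5, gamma=1e-3):
    # Regularization network: one hidden unit per training point,
    # output weights w solve the regularized system (K + gamma*I) w = y
    K = gaussian_kernel(X, X, width)
    w = np.linalg.solve(K + gamma * np.eye(len(X)), y)
    return lambda Xnew: gaussian_kernel(Xnew, X, width) @ w

# Toy 1-D example: approximate sin(x) on [0, 2*pi] from noisy samples
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2 * np.pi, size=(40, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
f = fit_rn(X, y)
Xt = np.linspace(0.0, 2 * np.pi, 100)[:, None]
err = np.abs(f(Xt) - np.sin(Xt[:, 0])).max()
```

Increasing gamma trades fit on the training data for smoothness, which is the generalization/approximation trade-off the paper evaluates against the number of hidden units.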