Radial basis function networks and complexity regularization in function learning
IEEE Transactions on Neural Networks
We apply normalized radial basis function (RBF) networks to the problem of learning nonlinear regression functions. The network parameters are learned by empirical risk minimization combined with complexity regularization. We study the convergence of these networks for various radial kernels as the number of training samples increases, and we also examine the rates of convergence.
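The approach in the abstract can be sketched in code. The following is a minimal illustration, not the paper's actual construction: it builds normalized Gaussian RBF features, fits the output weights by empirical risk minimization (least squares under squared loss), and selects the network size with a simple penalized criterion standing in for complexity regularization. The penalty form `c * k * log(n) / n` and all function names are assumptions chosen for illustration.

```python
import numpy as np

def normalized_rbf_features(X, centers, bandwidth):
    # Phi[i, k] = K(||x_i - c_k||/b) / sum_j K(||x_i - c_j||/b),
    # here with the Gaussian kernel K(r) = exp(-r^2).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    K = np.exp(-d2 / bandwidth ** 2)
    return K / K.sum(axis=1, keepdims=True)

def fit_normalized_rbf(X, y, centers, bandwidth):
    # Empirical risk minimization with squared loss: since the network is
    # linear in its output weights, the minimizer is a least-squares solution.
    Phi = normalized_rbf_features(X, centers, bandwidth)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict(X, centers, bandwidth, w):
    return normalized_rbf_features(X, centers, bandwidth) @ w

def select_by_complexity_regularization(X, y, bandwidth, ks, c=1.0):
    # Complexity regularization (hypothetical penalty form): choose the
    # network size k minimizing empirical risk plus a penalty growing with k.
    n = len(X)
    best = None
    for k in ks:
        centers = X[:: max(1, n // k)][:k]
        w = fit_normalized_rbf(X, y, centers, bandwidth)
        risk = np.mean((predict(X, centers, bandwidth, w) - y) ** 2)
        score = risk + c * k * np.log(n) / n
        if best is None or score < best[0]:
            best = (score, k, centers, w)
    return best[1:]
```

As a usage sketch, fitting noisy samples of a smooth target (e.g. `sin` on an interval) with a few dozen centers drawn from the training inputs already drives the empirical risk near zero, while the penalty term discourages needlessly large networks.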