We study strong universal consistency and rates of convergence of nonlinear regression function learning algorithms based on normalized radial basis function networks. The network parameters (centers, covariance matrices, and synaptic weights) are learned by empirical risk minimization. We also establish rates of convergence for networks whose parameters are learned by complexity regularization.
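As a minimal sketch of the model class studied here, the code below evaluates a normalized radial basis function network with Gaussian bases: the output is a weighted average of the synaptic weights, with weights given by kernel values that depend on the centers and (inverse) covariance matrices. The function name and array layout are illustrative assumptions, not taken from the paper; training these parameters by empirical risk minimization is not shown.

```python
import numpy as np

def normalized_rbf(x, centers, cov_invs, weights):
    """Output of a normalized RBF network at a point x.

    f(x) = sum_i w_i * K_i(x) / sum_j K_j(x),
    where K_i(x) = exp(-(x - c_i)^T S_i^{-1} (x - c_i)) is a Gaussian
    basis with center c_i and inverse covariance matrix S_i^{-1}.
    (Illustrative sketch; names and shapes are assumptions.)
    """
    diffs = centers - x                                   # (k, d)
    # Mahalanobis-type squared distance to each of the k centers
    d2 = np.einsum("ki,kij,kj->k", diffs, cov_invs, diffs)
    k = np.exp(-d2)
    # Normalization makes the output a convex combination of the weights
    return np.dot(weights, k) / np.sum(k)
```

Because of the normalization, the network output always lies between the smallest and largest synaptic weight, which is one reason normalized RBF networks are convenient for consistency analysis.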