We prove that neural networks with a single hidden layer are capable of providing an optimal order of approximation for functions assumed to possess a given number of derivatives, provided the activation function evaluated by each principal element satisfies certain technical conditions. Under these conditions, it is also possible to construct networks that provide a geometric order of approximation for analytic target functions. The permissible activation functions include the squashing function (1 + e^{-x})^{-1} as well as a variety of radial basis functions. Our proofs are constructive. The weights and thresholds of our networks are chosen independently of the target function; we give explicit formulas for the coefficients as simple, continuous, linear functionals of the target function.
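The construction described above can be illustrated numerically. The sketch below (using NumPy; all function names are my own, and the least-squares fit stands in for the paper's explicit coefficient formulas) fixes the hidden-layer weights and thresholds once, independently of any target, and then obtains the output coefficients by a map that is linear in the sampled target values:

```python
import numpy as np

# Minimal sketch, NOT the paper's explicit construction: a single-hidden-layer
# network whose hidden weights and thresholds are fixed in advance,
# independently of the target function. Only the output coefficients depend
# on the target, and they do so linearly (a least-squares fit to point
# samples is a linear functional of the sampled values).

rng = np.random.default_rng(0)

def sigmoid(t):
    # the squashing function (1 + e^{-t})^{-1}
    return 1.0 / (1.0 + np.exp(-t))

n_hidden = 40
# hidden weights and thresholds: chosen once, target-independent
w = rng.uniform(-5.0, 5.0, n_hidden)
b = rng.uniform(-5.0, 5.0, n_hidden)

def fit_coefficients(f, n_samples=200):
    """Output coefficients for target f; the map f -> c is linear."""
    x = np.linspace(-1.0, 1.0, n_samples)
    H = sigmoid(np.outer(x, w) + b)        # (n_samples, n_hidden) design matrix
    c, *_ = np.linalg.lstsq(H, f(x), rcond=None)
    return c

def network(x, c):
    # single-hidden-layer network with the fixed hidden units above
    return sigmoid(np.outer(np.atleast_1d(x), w) + b) @ c

# smooth (indeed analytic) target: the error should be small
target = np.cos
c = fit_coefficients(target)
x_test = np.linspace(-1.0, 1.0, 101)
err = np.max(np.abs(network(x_test, c) - target(x_test)))
print(f"max error: {err:.2e}")
```

Because `fit_coefficients` is linear in the target samples, fitting a sum of two targets gives the sum of their coefficient vectors, mirroring the abstract's point that the coefficients are continuous linear functionals of the target function.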