We propose a general method for estimating the distance between a compact subspace K of the space L^1([0,1]^s) of Lebesgue-integrable functions on the hypercube [0,1]^s and the class of functions computed by artificial neural networks with a single hidden layer, each unit evaluating a sigmoidal activation function. Our lower bounds are stated in terms of an invariant that measures the oscillations of functions in K around the origin. As an application, we estimate the minimal number of neurons required to approximate bounded functions satisfying uniform Lipschitz conditions of order *** with accuracy ***.
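To make the object of study concrete, the sketch below builds the class of approximants discussed above: a single-hidden-layer network of sigmoidal units on the hypercube, with the empirical L^1 distance to a Lipschitz target measured by Monte Carlo sampling. The target function, the random-feature weights, and the least-squares fit of the outer coefficients are all illustrative assumptions, not the paper's construction; in particular, the fit is only a crude stand-in for the best n-unit approximant whose error the paper bounds from below.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def shallow_net(x, W, b, c):
    """Single-hidden-layer sigmoidal network:
    f(x) = sum_k c_k * sigma(<W_k, x> + b_k)."""
    return sigmoid(x @ W.T + b) @ c

# Hypothetical target: a Lipschitz function on [0,1]^2.
target = lambda x: np.abs(x[:, 0] - x[:, 1])

rng = np.random.default_rng(0)
s, n = 2, 50                # input dimension, number of hidden units
X = rng.random((1000, s))   # samples drawn from the hypercube [0,1]^s

# Random hidden-layer weights plus least squares on the outer
# coefficients: a simple proxy for an n-unit approximant.
W = rng.normal(size=(n, s)) * 10
b = rng.normal(size=n) * 10
H = sigmoid(X @ W.T + b)
c, *_ = np.linalg.lstsq(H, target(X), rcond=None)

# Monte Carlo estimate of the L^1 approximation error.
err = np.mean(np.abs(shallow_net(X, W, b, c) - target(X)))
```

The paper's lower bounds say that no choice of weights can push this error below a threshold that shrinks only as the number of hidden units n grows; the sketch merely shows where that error lives.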