For the nearly exponential type of feedforward neural networks (neFNNs), the essential order of approximation is revealed. It is proven that for any continuous function defined on a compact subset of R^d, there exists a three-layer neFNN with a fixed number of hidden neurons that attains this essential order. Under certain assumptions on the neFNNs, sharp upper and lower bound estimates on their approximation precision are provided. The obtained results not only characterize the intrinsic approximation capability of neFNNs, but also make explicit the relationship between the precision (rate) of approximation and the number of hidden neurons.
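The dependence of approximation precision on the number of hidden neurons can be illustrated numerically. The sketch below is not the paper's construction for neFNNs; it is the classical staircase argument for one-hidden-layer sigmoidal networks, assuming a steep logistic activation so each hidden unit acts like a step function. The function names (`neural_approx`, `max_err`) and the slope parameter `k_scale` are illustrative choices, not from the source.

```python
import math

def sigmoid(z):
    # Logistic activation; clamp the argument to avoid overflow in exp.
    z = max(-60.0, min(60.0, z))
    return 1.0 / (1.0 + math.exp(-z))

def neural_approx(f, n, k_scale=100.0):
    """One-hidden-layer sigmoidal network with n hidden units that
    approximates f on [0, 1] via the classical staircase construction:
    each unit contributes a near-step of height f(x_i) - f(x_{i-1})."""
    xs = [i / n for i in range(n + 1)]                      # uniform grid
    centers = [(xs[i - 1] + xs[i]) / 2 for i in range(1, n + 1)]
    weights = [f(xs[i]) - f(xs[i - 1]) for i in range(1, n + 1)]
    k = k_scale * n  # slope grows with n so transitions stay inside cells
    bias = f(xs[0])
    def g(x):
        return bias + sum(w * sigmoid(k * (x - c))
                          for w, c in zip(weights, centers))
    return g

f = lambda x: math.sin(math.pi * x)     # a smooth test function
test_pts = [j / 1000 for j in range(1001)]

def max_err(n):
    g = neural_approx(f, n)
    return max(abs(f(x) - g(x)) for x in test_pts)

err_small = max_err(5)    # 5 hidden neurons: coarse approximation
err_large = max_err(50)   # 50 hidden neurons: visibly smaller error
```

Increasing the number of hidden neurons shrinks the uniform error roughly like the modulus of continuity of f over the grid spacing, which is the qualitative precision-versus-width trade-off the abstract refers to.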