Convergence rates for single hidden layer feedforward networks
Neural Networks
Recently, Barron (1993) gave approximation rates for single hidden layer feedforward networks with sigmoid activation functions approximating a class of functions satisfying a certain smoothness condition. These rates do not depend on the dimension of the input space. We extend Barron's results to feedforward networks with possibly nonsigmoid activation functions that approximate mappings and their derivatives simultaneously. Our conditions are similar to, though not identical with, Barron's, yet we obtain the same rates of approximation: the approximation error decreases as fast as n^{-1/2}, where n is the number of hidden units. The dimension of the input space appears only in the constants of our bounds.
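The n^{-1/2} rate can be illustrated numerically. The sketch below is a hypothetical experiment, not the paper's construction: it fits a single hidden layer of n sigmoid units to a smooth one-dimensional target, drawing the input weights and biases at random and solving only for the output weights by least squares, then reports the empirical L2 error for increasing n.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 400)[:, None]            # 1-D input grid
target = np.sin(3 * x[:, 0]) + 0.5 * x[:, 0] ** 2   # smooth target function

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_error(n):
    """Empirical L2 error of the best output-weight fit with n sigmoid units.

    Input weights and biases are random (illustrative choice, not the
    paper's method); only the output layer is optimized, by least squares.
    """
    w = rng.normal(scale=3.0, size=(1, n))   # random input weights
    b = rng.normal(scale=3.0, size=n)        # random biases
    H = sigmoid(x @ w + b)                   # hidden-unit activations
    coef, *_ = np.linalg.lstsq(H, target, rcond=None)
    residual = target - H @ coef
    return np.sqrt(np.mean(residual ** 2))

for n in (4, 16, 64, 256):
    print(n, fit_error(n))
```

The printed errors shrink as n grows; with random (rather than optimally chosen) hidden units the observed decay is only a rough proxy for the n^{-1/2} bound, which concerns the best n-unit approximant.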