The aim of this paper is to investigate the error that results from approximation by operators built on the logarithmic sigmoidal function. By means of a function-extension method, a class of feed-forward neural network operators is introduced. Using these operators as approximation tools, upper bounds on the error of approximating continuous functions, measured in the uniform norm, are estimated. In addition, a class of quasi-interpolation operators with the logarithmic sigmoidal function is constructed for approximating continuous functions defined on the whole real line.
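The paper's exact operator definitions are not reproduced here, but the general construction it refers to can be illustrated with a standard quasi-interpolation sketch: combine shifted sigmoids into a bell-shaped kernel that forms a partition of unity, then sample the target function on the grid k/n. The function names, the truncation width, and the specific kernel φ(x) = σ(x+1) − σ(x) below are illustrative assumptions, not the operators from the paper.

```python
import math

def sigma(x):
    # logistic ("logarithmic") sigmoidal function
    return 1.0 / (1.0 + math.exp(-x))

def phi(x):
    # Bell-shaped kernel from differenced sigmoid shifts; the integer
    # translates phi(x - k) telescope to sigma(inf) - sigma(-inf) = 1,
    # so they form a partition of unity on the real line.
    return sigma(x + 1) - sigma(x)

def quasi_interpolant(f, x, n, width=40):
    # G_n(f)(x) = sum_k f(k/n) * phi(n*x - k), truncated to a window
    # around n*x since phi decays exponentially in both directions.
    k0 = int(math.floor(n * x))
    return sum(f(k / n) * phi(n * x - k)
               for k in range(k0 - width, k0 + width + 1))
```

Because the kernel sums to one, constants are reproduced exactly, and for a continuous f the error at each point is controlled by the modulus of continuity of f at scale roughly 1/n, which is the kind of uniform-norm upper bound the abstract describes.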