Some new results on neural network approximation
Neural Networks
Let f be an arbitrary continuous function defined on R^d and let h be any sigmoid function. Then there exists a linear combination of scaled, shifted rotations of h that approximates f uniformly on the whole space R^d, provided some such linear combination approximates f uniformly in a neighbourhood of the point at infinity. From this result it follows that any function continuous on the one-point compactification of R^d (R^d with the point at infinity adjoined) can be approximated in the same way. Further, a necessary and sufficient condition on h is obtained under which the uniform approximation can be implemented without scaling of h. The Heaviside function, the logistic function, the Gaussian distribution function, and the arctangent sigmoid all satisfy this condition, as does any sigmoid function that increases only on a finite interval. Examples of sigmoid functions that cannot implement the uniform approximation without scaling are also given.
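The construction above is existential, but its basic ingredient, a linear combination of scaled, shifted copies of a sigmoid, is easy to illustrate numerically. The sketch below (an illustration only, not the paper's construction; the basis width, number of units, and least-squares fitting are all assumptions made here) approximates a continuous function on a compact interval by a linear combination of shifted logistic sigmoids with a fixed scale, solving for the coefficients by least squares:

```python
import numpy as np

def sigmoid(z):
    """Logistic sigmoid, one of the functions that permits unscaled approximation."""
    return 1.0 / (1.0 + np.exp(-z))

def fit_sigmoid_combination(f, xs, n_units=30, scale=5.0):
    """Fit coefficients c_k so that sum_k c_k * h(scale*(x - t_k)) ~ f(x).

    The centers t_k are spread evenly over the domain; 'scale' fixes the
    steepness of every unit (chosen here by hand, an assumption).
    """
    centers = np.linspace(xs.min(), xs.max(), n_units)
    # Design matrix: one column per shifted sigmoid evaluated on the grid.
    basis = sigmoid(scale * (xs[:, None] - centers[None, :]))
    coeffs, *_ = np.linalg.lstsq(basis, f(xs), rcond=None)
    return centers, coeffs

# Approximate sin on [-3, 3] and measure the uniform (sup-norm) error on the grid.
xs = np.linspace(-3.0, 3.0, 400)
centers, coeffs = fit_sigmoid_combination(np.sin, xs)
approx = sigmoid(5.0 * (xs[:, None] - centers[None, :])) @ coeffs
max_err = np.abs(approx - np.sin(xs)).max()
print(max_err)
```

Increasing `n_units` (and refining the grid) drives the sup-norm error down, in line with the density result; the one-dimensional case with rotations omitted is the simplest instance of the theorem's setting.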