Computing with arrays of bell-shaped and sigmoid functions
NIPS-3 Proceedings of the 1990 conference on Advances in neural information processing systems 3
The general approximation problem of interest in the area of feedforward neural networks is stated. Solutions for several special cases are given, including an upper bound on the number of hidden-layer nodes and the weights for that configuration. Analytical solutions to the general feedforward neural network problem in the one-dimensional case, requiring an infinite number of nodes, are presented. Practical solutions (not requiring an infinite number of nodes) in the one-dimensional case are derived under piecewise constant approximation with constant-width partitions, under piecewise constant approximation with variable-width partitions, and under piecewise linear approximation using ramps instead of sigmoids. A four-layer solution to the general problem in the n-dimensional case is presented. A three-layer solution in the n-dimensional case with piecewise constant approximation requires the corner function as the activation function. The corner function, a special case of the n-dimensional sigmoid function, is found to have desirable characteristics and can be used to approximate functions under much weaker requirements (only boundedness and piecewise continuity). Concave regions can be formed with a single layer of corner-function nodes.
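The one-dimensional piecewise constant construction with constant-width partitions can be sketched as a one-hidden-layer network of hard-limiting units: each partition is a plateau formed by the difference of two step functions (the zero-temperature limit of the sigmoid), scaled by the target function's value at the partition midpoint. The helper name `piecewise_constant_net` and the choice of 64 partitions are illustrative assumptions, not the paper's notation.

```python
import math

def step(z):
    # Hard-limiting unit: the limit of the sigmoid 1/(1+exp(-z/T)) as T -> 0.
    return 1.0 if z >= 0.0 else 0.0

def piecewise_constant_net(f, a, b, n):
    """Hypothetical helper: build a one-hidden-layer network approximating
    f on [a, b] with n constant-width partitions."""
    width = (b - a) / n
    # Each partition contributes a pair of step nodes: one switches the
    # plateau on at the left edge, the other switches it off at the right.
    params = []
    for i in range(n):
        left = a + i * width
        mid = left + width / 2.0
        params.append((left, left + width, f(mid)))

    def net(x):
        # Output node: weighted sum of step-node differences.
        return sum(c * (step(x - l) - step(x - r)) for l, r, c in params)
    return net

# Approximate sin on [0, 2*pi]; error shrinks as the partition width does.
approx = piecewise_constant_net(math.sin, 0.0, 2 * math.pi, 64)
```

Replacing each step pair with a ramp (a piecewise linear activation) turns the same construction into the piecewise linear approximation mentioned above, trading plateaus for line segments between partition endpoints.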