Multilayer feedforward networks are universal approximators
Neural Networks
A Theory of Networks for Approximation and Learning
Implementation of Kolmogorov Learning Algorithm for Feedforward Neural Networks
ICCS '01 Proceedings of the International Conference on Computational Science-Part II
A novel fast Kolmogorov's spline complex network for pattern detection
WSEAS TRANSACTIONS on SYSTEMS
A novel fast Kolmogorov's spline complex network for pattern detection
SMO'08 Proceedings of the 8th conference on Simulation, modelling and optimization
Function Decomposition Network
ICANN '09 Proceedings of the 19th International Conference on Artificial Neural Networks: Part I
Self-organizing multilayer perceptron
IEEE Transactions on Neural Networks
Predictive Kohonen map for speech features extraction
ICANN'05 Proceedings of the 15th international conference on Artificial neural networks: formal models and their applications - Volume Part II
Expert Systems with Applications: An International Journal
Using Kolmogorov Inspired Gates for Low Power Nanoelectronics
IWANN'05 Proceedings of the 8th international conference on Artificial Neural Networks: computational Intelligence and Bioinspired Systems
Many neural networks can be regarded as attempting to approximate a multivariate function in terms of one-input, one-output units. This note considers the problem of exactly representing nonlinear mappings in terms of simpler functions of fewer variables. We review Kolmogorov's theorem on the representation of functions of several variables in terms of functions of one variable, and show that it is irrelevant in the context of networks for learning.
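For reference, the superposition theorem discussed in the abstract (Kolmogorov, 1957) states that any continuous function of $n$ variables on the unit cube can be written exactly as a finite composition of continuous functions of a single variable and addition; the standard form is:

```latex
% Kolmogorov's superposition theorem: for any continuous
% f : [0,1]^n -> R there exist continuous one-variable
% functions \Phi_q and fixed continuous monotone functions
% \phi_{qp} such that
f(x_1, \ldots, x_n)
  = \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{qp}(x_p) \right)
```

The inner functions $\phi_{qp}$ are universal (independent of $f$), while the outer functions $\Phi_q$ depend on $f$; the paper's argument turns on the fact that these $\Phi_q$ are generally highly non-smooth, which is why the exact representation does not translate into a practical learning architecture.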