Universal approximation using radial-basis-function networks. Neural Computation.
Introduction to artificial neural systems.
SIAM Journal on Scientific Computing.
Computers and Intractability: A Guide to the Theory of NP-Completeness.
Qualitative Analysis and Synthesis of Recurrent Neural Networks.
Reformulated radial basis neural networks trained by gradient descent. IEEE Transactions on Neural Networks.
A superposition of radial basis functions centered at given prototype patterns constitutes one of the most suitable energy forms for gradient systems that perform nearest-neighbor classification with real-valued static prototypes. It has been shown in [1] that a continuous-time dynamical neural network model, employing a radial-basis-function subnetwork and a sigmoid multilayer-perceptron subnetwork, is capable of maximizing such an energy form locally, thus performing nearest-neighbor classification almost perfectly when initiated with a distorted pattern. This paper reviews the proposed design procedure and presents the results of extensive experiments with the classifier on random prototypes.
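The core idea can be illustrated with a minimal sketch: define the energy as a sum of Gaussians centered at the prototypes, run gradient ascent from the distorted input, and read off the prototype whose basin of attraction the trajectory settles into. This is a simplified discrete-time illustration, not the continuous-time network of [1]; the bandwidth `sigma`, step size `lr`, and iteration count are illustrative assumptions.

```python
import numpy as np

def rbf_energy_grad(x, prototypes, sigma=0.5):
    # Gradient of E(x) = sum_i exp(-||x - p_i||^2 / (2 sigma^2)),
    # a superposition of Gaussians centered at the prototypes.
    diffs = prototypes - x                                   # (N, d)
    w = np.exp(-np.sum(diffs**2, axis=1) / (2 * sigma**2))   # (N,)
    return (w[:, None] * diffs).sum(axis=0) / sigma**2       # (d,)

def classify(x0, prototypes, sigma=0.5, lr=0.1, steps=500):
    # Gradient ascent on the RBF energy, starting from the
    # distorted pattern x0; the state climbs toward a local
    # maximum near one of the prototypes.
    x = x0.astype(float).copy()
    for _ in range(steps):
        x += lr * rbf_energy_grad(x, prototypes, sigma)
    # Report the prototype whose basin the trajectory reached.
    return int(np.argmin(np.linalg.norm(prototypes - x, axis=1)))
```

For well-separated prototypes and a suitably small bandwidth, each prototype sits near a distinct local maximum of the energy, so gradient ascent from a moderately distorted pattern recovers the nearest prototype, which is the classification behavior described above.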