A superposition of radial basis functions centered at given prototype patterns constitutes one of the most suitable energy forms for gradient systems that perform nearest-neighbor classification with real-valued static prototypes. This paper shows that a continuous-time dynamical neural network model, employing a radial-basis-function sub-network and a sigmoid multilayer-perceptron sub-network, is capable of locally maximizing such an energy form and thus performs nearest-neighbor classification almost perfectly when initialized with a distorted pattern. The proposed design scheme represents the prototype patterns explicitly as network parameters, so additional memory patterns can be added and existing ones removed. The dynamical classification scheme implemented by the network eliminates the explicit distance comparisons that form the core of the conventional nearest-neighbor classification procedure. The performance of the proposed network model is demonstrated on binary and gray-scale image reconstruction applications.
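The core idea can be illustrated with a minimal sketch: gradient ascent on an energy defined as a sum of Gaussian radial basis functions centered at the prototypes drives a distorted input toward the nearest prototype, without ever computing pairwise distance comparisons explicitly during the dynamics. This is only an illustrative discrete-time approximation of the paper's continuous-time network; the function names, the Gaussian kernel width `sigma`, and the step size `eta` are assumptions, not the authors' parameterization.

```python
import numpy as np

def rbf_energy_grad(x, prototypes, sigma=1.0):
    """Gradient of E(x) = sum_i exp(-||x - p_i||^2 / sigma^2)."""
    diffs = prototypes - x                                 # (m, n)
    weights = np.exp(-np.sum(diffs**2, axis=1) / sigma**2)  # (m,)
    return (2.0 / sigma**2) * (weights[:, None] * diffs).sum(axis=0)

def classify_by_gradient_ascent(x0, prototypes, sigma=1.0, eta=0.1, steps=500):
    """Ascend the RBF energy starting from a distorted pattern.

    The trajectory converges to a local maximum of E, which for
    well-separated prototypes lies near the closest prototype.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        x += eta * rbf_energy_grad(x, prototypes, sigma)
    # Read off the recovered class from the fixed point the dynamics reached.
    return int(np.argmin(np.linalg.norm(prototypes - x, axis=1)))

# Two toy prototypes and a noisy version of the first one.
prototypes = np.array([[0.0, 0.0], [3.0, 3.0]])
noisy = np.array([0.4, -0.3])
print(classify_by_gradient_ascent(noisy, prototypes))
```

Note that the final `argmin` is only used here to report the label of the fixed point; in the paper's scheme the network state itself converges to the prototype, which is why the comparison steps of conventional nearest-neighbor search are eliminated.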