A new type of multilayer network incorporating a certain class of Radial Basis Units (RBUs), whose kernels are implemented at the synaptic level, is compared through simulations with the Multi-Layer Perceptron (MLP) on a classification problem with high interference (overlap) between the class distributions. The simulations show that the new network achieves classification error rates close to those of the Optimum Bayesian Classifier (OBC), whereas the MLP exhibits an inherent weakness on these classification tasks.
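The abstract does not specify the exact form of the RBUs, whose kernels the authors place at the synaptic level. As a point of reference, the following is a minimal sketch of a standard unit-level Gaussian RBF classifier (not the paper's formulation): each hidden unit responds most strongly near its center, and a linear readout over these localized responses selects the class. The function names, the Gaussian kernel, and the example centers and weights are all illustrative assumptions.

```python
import numpy as np

def rbf_layer(x, centers, sigma):
    """Gaussian radial basis units: activation decays with distance from each center.

    Note: this is a conventional unit-level Gaussian RBF, used here only as an
    illustration; the paper's RBUs implement kernels at the synaptic level.
    """
    # Squared Euclidean distance from input x to every center (one row per unit).
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def rbf_classify(x, centers, weights, sigma=1.0):
    """Linear combination of RBF activations; the largest output gives the class."""
    phi = rbf_layer(x, centers, sigma)
    return int(np.argmax(weights @ phi))

# Illustrative two-class setup: one center per class, identity readout weights.
centers = np.array([[0.0, 0.0], [3.0, 3.0]])
weights = np.eye(2)
```

Because each unit's response is localized, such a network can carve out class regions even when the distributions interfere, which is consistent with the abstract's claim that the RBU network approaches the OBC error rate where the MLP struggles.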