Supervised Learning Probabilistic Neural Networks
Neural Processing Letters
This article proposes a procedure for automatically determining the elements of the covariance matrix of the Gaussian kernel function in probabilistic neural networks. Two matrices, a rotation matrix and a matrix of variances, are computed by analyzing the local environment of each training pattern; their combination forms the covariance matrix associated with that pattern. This automation has two advantages: first, it frees the network designer from specifying the complete covariance matrix, and second, it yields a network with better generalization ability than the original model. Experiments on a variation of the well-known two-spiral problem and on real-world data sets from the UCI Machine Learning Repository show that the proposed model not only achieves a higher classification rate than the original probabilistic neural network but also outperforms other well-known classification techniques.
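The construction described above can be sketched in code. The following is a minimal illustration, not the paper's exact procedure: it estimates a full covariance matrix for each training pattern from its k nearest neighbours (the eigenvectors of that matrix play the role of the rotation matrix and the eigenvalues the role of the variances), then uses those per-pattern covariances in the Gaussian kernels of a standard PNN decision rule. The neighbourhood size `k` and the regularization term `reg` are assumptions introduced for this sketch.

```python
import numpy as np

def local_covariances(X, k=5, reg=1e-3):
    """Estimate a full covariance matrix for each training pattern from
    its k nearest neighbours (stand-in for the rotation-plus-variances
    construction; k and reg are illustrative assumptions)."""
    n, d = X.shape
    covs = np.empty((n, d, d))
    for i in range(n):
        dists = np.linalg.norm(X - X[i], axis=1)
        idx = np.argsort(dists)[1:k + 1]   # k nearest neighbours, excluding the pattern itself
        diffs = X[idx] - X[i]
        # An eigen-decomposition of C would yield a rotation matrix
        # (eigenvectors) and per-direction variances (eigenvalues),
        # which recombine into this covariance matrix.
        covs[i] = diffs.T @ diffs / k + reg * np.eye(d)
    return covs

def pnn_predict(X_train, y_train, covs, X_test):
    """Classify test points with a PNN whose Gaussian kernels use the
    per-pattern covariance matrices computed above."""
    classes = np.unique(y_train)
    d = X_train.shape[1]
    preds = []
    for x in X_test:
        scores = np.zeros(len(classes))
        for xi, yi, Ci in zip(X_train, y_train, covs):
            diff = x - xi
            norm = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(Ci))
            p = norm * np.exp(-0.5 * diff @ np.linalg.solve(Ci, diff))
            scores[np.searchsorted(classes, yi)] += p  # sum kernel responses per class
        preds.append(classes[np.argmax(scores)])       # Bayes rule with equal priors
    return np.array(preds)
```

On two well-separated Gaussian clusters, `pnn_predict` assigns test points to the nearer cluster's class; the per-pattern covariances let each kernel stretch along the local orientation of the data rather than being forced to a single shared spherical width.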