Since the works by Specht, probabilistic neural networks (PNNs) have attracted researchers because of their fast training and their equivalence to the optimal Bayesian decision for the classification task. However, it is known that the conventional PNN implementation is not optimal for statistical recognition of a set of patterns. In this article we present a novel modification of the PNN and prove that it is optimal for this task under the general assumptions of the Bayes classifier. The modification is based on reducing the recognition task to a statistical homogeneity-testing problem. In our experiments we examine the problem of authorship attribution of Russian texts. The results support the claim that the proposed network achieves better accuracy and is much more robust to changes of the smoothing parameter of the Gaussian kernel function than the original PNN.
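The conventional PNN that the abstract refers to can be sketched as a Parzen-window classifier: a pattern layer evaluates a Gaussian kernel between the query and every training sample, a summation layer averages the kernels per class to estimate the class-conditional density, and the output layer takes the argmax. The following minimal Python sketch illustrates this baseline (the function and variable names are my own, and this is the original PNN, not the paper's homogeneity-testing modification):

```python
import numpy as np

def pnn_classify(train_X, train_y, x, sigma=0.5):
    """Classify x with a basic probabilistic neural network (Parzen window).

    For each class the pattern layer computes a Gaussian kernel between x
    and every training sample of that class; the summation layer averages
    them to estimate the class-conditional density; the output layer picks
    the class with the highest estimate. `sigma` is the smoothing parameter
    whose choice the abstract notes the conventional PNN is sensitive to.
    """
    classes = np.unique(train_y)
    scores = []
    for c in classes:
        Xc = train_X[train_y == c]
        sq_dists = np.sum((Xc - x) ** 2, axis=1)
        scores.append(np.mean(np.exp(-sq_dists / (2.0 * sigma ** 2))))
    return classes[int(np.argmax(scores))]

# Toy example: two well-separated 2-D clusters.
X = np.array([[0.0, 0.0], [0.1, 0.2], [2.0, 2.0], [2.1, 1.9]])
y = np.array([0, 0, 1, 1])
print(pnn_classify(X, y, np.array([0.05, 0.1])))  # query near the class-0 cluster
```

Because each class score is a kernel density estimate, the decision depends directly on `sigma`; the abstract's point is that the proposed homogeneity-testing modification reduces this dependence.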