We propose a novel technique for designing radial basis function (RBF) neural networks (NNs). To select the various RBF parameters, the class membership of the training samples is used to form new cluster classes. This allows classification performance on selected classes to be emphasized rather than only the best overall classification, so that performance can be controlled as desired and Neyman-Pearson classification can be approximated. We also show that, with a proper choice of the desired output neuron levels, the hidden-to-output layer of the RBF network performs Fisher discriminant analysis, and that the full system thus performs a nonlinear Fisher analysis. Results on an agricultural product inspection problem and on synthetic data confirm the effectiveness of these methods.
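The idea of using class membership to choose RBF centers can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: it clusters each class separately to obtain the hidden-unit centers, then fits the hidden-to-output weights by least squares against per-class target levels (the choice of these target levels is where the Fisher-discriminant connection enters). The function names, the fixed Gaussian width `sigma`, and the `clusters_per_class` parameter are all illustrative assumptions.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    # Minimal k-means used to pick cluster centers within one class.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return centers

def rbf_design(X, y, clusters_per_class=2, sigma=1.0):
    # Exploit class membership: cluster each class separately so the
    # centers respect class structure rather than the pooled data.
    classes = np.unique(y)
    centers = np.vstack([kmeans(X[y == c], clusters_per_class)
                         for c in classes])
    # Gaussian hidden-layer activations.
    Phi = np.exp(-((X[:, None] - centers) ** 2).sum(-1) / (2 * sigma**2))
    # One-hot target levels; adjusting these per class is one way to
    # bias performance toward selected classes.
    T = (y[:, None] == classes[None, :]).astype(float)
    W, *_ = np.linalg.lstsq(Phi, T, rcond=None)  # hidden-to-output weights
    return centers, W, sigma, classes

def rbf_predict(X, centers, W, sigma, classes):
    Phi = np.exp(-((X[:, None] - centers) ** 2).sum(-1) / (2 * sigma**2))
    return classes[np.argmax(Phi @ W, axis=1)]
```

On well-separated data this class-wise center selection typically recovers the class structure with very few hidden units per class.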