Leave-One-Out Procedures for Nonparametric Error Estimates
IEEE Transactions on Pattern Analysis and Machine Intelligence
The use of k-nearest neighbor (k-NN) and Parzen density estimates to obtain estimates of the Bayes error is investigated under limited design-set conditions. By drawing analogies between the k-NN and Parzen procedures, new procedures are suggested, and experimental results indicate that these procedures yield a significant improvement over the conventional k-NN and Parzen procedures. We show that, by varying the decision threshold, many of the biases associated with the k-NN or Parzen density estimates can be compensated for, so that successful error estimation may be performed in spite of these biases. Experimental results demonstrate the effects of kernel size and shape (Parzen), the size of k (k-NN), and the number of samples in the design set.
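The leave-one-out idea underlying the procedures above can be sketched as follows: each design sample is classified by its k nearest neighbors among the remaining samples, and the fraction misclassified estimates the error rate. This is a minimal illustration on synthetic two-class Gaussian data, not the paper's method; the bias-compensation via a varied decision threshold and the Parzen variants are omitted.

```python
import numpy as np

def loo_knn_error(X, y, k=3):
    """Leave-one-out k-NN error rate: classify each sample by majority
    vote of its k nearest neighbors among the *other* samples."""
    n = len(X)
    # Pairwise squared Euclidean distances between all design samples.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # Exclude each held-out sample from its own neighbor list.
    np.fill_diagonal(d2, np.inf)
    errors = 0
    for i in range(n):
        nbrs = np.argsort(d2[i])[:k]          # indices of k nearest neighbors
        vote = np.bincount(y[nbrs]).argmax()  # majority-vote label
        errors += int(vote != y[i])
    return errors / n

# Illustrative data: two 2-D Gaussian classes with different means.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)),
               rng.normal(1.5, 1.0, (100, 2))])
y = np.repeat([0, 1], 100)
print(loo_knn_error(X, y, k=3))
```

As the abstract notes, the resulting estimate depends on the choice of k and the design-set size; in practice one would examine how the estimate behaves as both are varied.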