Abstract--Nearest neighbor classification assumes that class conditional probabilities are locally constant. This assumption breaks down in high dimensions because of the curse of dimensionality, and the nearest neighbor rule can then introduce severe bias. We propose an adaptive nearest neighbor classification method that aims to minimize this bias. Quasiconformally transformed kernels are used to compute neighborhoods over which the class conditional probabilities tend to be more homogeneous; as a result, better classification performance can be expected. The efficacy of our method is validated and compared against competing techniques on a variety of data sets.
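The general idea behind the abstract can be illustrated with a minimal sketch: a base kernel K is rescaled by a quasiconformal factor, K~(x, y) = c(x)·c(y)·K(x, y), and nearest neighbors are found under the distance induced by K~ in feature space rather than the Euclidean distance. The RBF base kernel, the placeholder conformal factor `c`, and all function names below are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # Standard RBF (Gaussian) base kernel.
    return np.exp(-gamma * np.sum((x - y) ** 2))

def quasiconformal_kernel(x, y, c, gamma=1.0):
    # Quasiconformal transformation K~(x, y) = c(x) c(y) K(x, y);
    # the factor c(.) locally rescales the induced feature-space geometry.
    return c(x) * c(y) * rbf_kernel(x, y, gamma)

def kernel_distance_sq(x, y, c, gamma=1.0):
    # Squared distance in the feature space induced by the transformed kernel:
    # ||phi(x) - phi(y)||^2 = K~(x, x) - 2 K~(x, y) + K~(y, y)
    return (quasiconformal_kernel(x, x, c, gamma)
            - 2.0 * quasiconformal_kernel(x, y, c, gamma)
            + quasiconformal_kernel(y, y, c, gamma))

def knn_predict(X_train, y_train, x, c, k=3, gamma=1.0):
    # Majority vote among the k neighbors nearest under the
    # transformed-kernel distance instead of the Euclidean one.
    d = np.array([kernel_distance_sq(xi, x, c, gamma) for xi in X_train])
    nearest = np.argsort(d)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]
```

With a constant factor `c(x) = 1` the transformed-kernel distance is monotone in the Euclidean distance for the RBF kernel, so the sketch reduces to ordinary k-NN; a data-dependent `c` (e.g. one that shrinks distances where class probabilities vary slowly) is what reshapes the neighborhoods.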