Nearest neighbor (NN) classification relies on the assumption that class conditional probabilities are locally constant. In high-dimensional spaces with finite samples, this assumption fails because of the curse of dimensionality, and the NN rule can suffer severe bias. We propose a locally adaptive neighborhood morphing classification method that aims to minimize this bias. Local support vector machine learning estimates an effective metric that produces neighborhoods elongated along the less discriminant feature dimensions and constricted along the most discriminant ones. Class conditional probabilities can then be expected to be approximately constant within the modified neighborhoods, yielding better classification performance. We validate the efficacy of our method and compare it against competing techniques on a number of datasets.
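The idea in the abstract can be sketched as follows. This is a minimal, illustrative implementation, not the authors' algorithm: it uses the local class-mean difference as a stand-in for the weight vector of the local linear SVM, and a diagonal feature weighting as the "morphed" metric. The function name `morphed_nn_predict` and all parameters (`k_local`, `k`) are hypothetical.

```python
import numpy as np

def morphed_nn_predict(X, y, query, k_local=20, k=5, eps=1e-9):
    """Classify `query` with a locally adapted (morphed) neighborhood."""
    # 1. Gather an initial neighborhood around the query (plain Euclidean).
    d0 = np.linalg.norm(X - query, axis=1)
    idx = np.argsort(d0)[:k_local]
    Xl, yl = X[idx], y[idx]

    # 2. Estimate a local discriminant direction w.  The paper fits a
    #    local linear SVM; here the class-mean difference is a simple
    #    stand-in (an assumption of this sketch, not the paper's estimator).
    classes = np.unique(yl)
    if len(classes) < 2:
        return classes[0]  # neighborhood is pure: nothing to adapt
    m0 = Xl[yl == classes[0]].mean(axis=0)
    m1 = Xl[yl == classes[1]].mean(axis=0)
    w = np.abs(m1 - m0)

    # 3. Morph the metric: a large |w_j| marks a discriminant feature, so
    #    it gets a large weight, constricting the neighborhood along that
    #    axis; a small |w_j| elongates the neighborhood along that axis.
    weights = w / (w.sum() + eps)

    # 4. Weighted-Euclidean kNN vote in the morphed metric.
    d = np.sqrt(((X - query) ** 2 * weights).sum(axis=1))
    votes = y[np.argsort(d)[:k]]
    vals, counts = np.unique(votes, return_counts=True)
    return vals[np.argmax(counts)]
```

On data where only one feature carries class information, the morphed metric down-weights the noise dimensions, so the vote is dominated by neighbors that are close along the discriminant axis.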