Locally linear metric adaptation for semi-supervised clustering
ICML '04 Proceedings of the twenty-first international conference on Machine learning
Kernel Vector Approximation Files for Relevance Feedback Retrieval in Large Image Databases
Multimedia Tools and Applications
Learning a Mahalanobis distance metric for data clustering and classification
Pattern Recognition
International Journal of Knowledge Engineering and Soft Data Paradigms
Hierarchical Multi-view Fisher Discriminant Analysis
ICONIP '09 Proceedings of the 16th International Conference on Neural Information Processing: Part II
Kernel-Based metric adaptation with pairwise constraints
ICMLC'05 Proceedings of the 4th international conference on Advances in Machine Learning and Cybernetics
A Semi-Supervised Metric Learning for Content-Based Image Retrieval
International Journal of Computer Vision and Image Processing
Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions due to the curse of dimensionality, and severe bias can be introduced when the nearest neighbor rule is applied under these conditions. We propose an adaptive nearest neighbor classification method that aims to minimize this bias. We use quasiconformal transformed kernels to compute neighborhoods over which the class conditional probabilities tend to be more homogeneous; as a result, better classification performance can be expected. The efficacy of our method is validated and compared against competing techniques on a variety of data sets.
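The general idea in the abstract can be illustrated with a small sketch: a quasiconformal transform rescales a base kernel as K~(x, y) = c(x)·c(y)·K(x, y), and neighborhoods are then computed from feature-space distances under K~. This is only a toy illustration, not the authors' exact method: the kernel choice (Gaussian), the function names, and the fact that the weights c(·) are supplied by the caller (rather than adapted from local class-probability estimates, as the abstract implies) are all assumptions made here.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def quasiconformal_knn(X_train, y_train, X_test, c_train, c_test, k=3, sigma=1.0):
    """k-NN under the quasiconformal transform K~(x, y) = c(x) c(y) K(x, y).

    The weights c(.) are inputs to this sketch; in an adaptive scheme they
    would be estimated from local class-probability information so that the
    induced neighborhoods are more class-homogeneous.
    """
    # Cross kernel between test and training points, rescaled by the weights.
    K_cross = gaussian_kernel(X_test, X_train, sigma) * np.outer(c_test, c_train)
    # For the Gaussian kernel K(x, x) = 1, so K~(x, x) = c(x)^2.
    K_test_diag = c_test ** 2
    K_train_diag = c_train ** 2
    # Squared feature-space distance: K~(x,x) - 2 K~(x,y) + K~(y,y).
    d2 = K_test_diag[:, None] - 2.0 * K_cross + K_train_diag[None, :]
    # Indices of the k nearest training points for each test point.
    idx = np.argsort(d2, axis=1)[:, :k]
    # Majority vote among the k nearest neighbors.
    return np.array([np.bincount(y_train[v]).argmax() for v in idx])

if __name__ == "__main__":
    # Two well-separated clusters; with c(.) = 1 this reduces to plain
    # kernel k-NN, since the transform leaves distances unchanged.
    X_tr = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
    y_tr = np.array([0, 0, 1, 1])
    X_te = np.array([[0.05, 0.0], [5.05, 5.0]])
    pred = quasiconformal_knn(X_tr, y_tr, X_te, np.ones(4), np.ones(2), k=3)
    print(pred)
```

With uniform weights the transform is the identity on distances; the adaptive behavior comes entirely from choosing c(·) to expand the metric near class boundaries and contract it within homogeneous regions.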