The k-local hyperplane distance nearest neighbor classifier (HKNN) builds a nonlinear decision surface with maximal local margin in the input space, inferring invariance from the local neighborhood rather than from prior knowledge, and therefore performs well in many applications. However, it still falls short of human-level classification on noisy, sparse, and imbalanced data. This paper proposes a new approach, called the relative local hyperplane classifier (RLHC), which addresses this problem by introducing perceptual relativity into HKNN. For each class, RLHC finds the k nearest neighbors of the query sample and then applies the relative transformation to all of these neighbors to build a relative space. Each local hyperplane is then constructed in the relative space and used to perform the classification. Experimental results on both real and simulated data suggest that the proposed approach often yields better classification accuracy and robustness.
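The pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes one common form of the relative transformation (mapping a point to its vector of Euclidean distances to an anchor set) and an HKNN-style ridge-regularized projection onto each class's local hyperplane; the function names and the choice of anchor set are hypothetical.

```python
import numpy as np

def relative_transform(anchors, points):
    # One common form of the "relative transformation" (an assumption here):
    # represent each point by its Euclidean distances to the anchor set.
    return np.linalg.norm(points[:, None, :] - anchors[None, :, :], axis=2)

def hyperplane_distance(q, nbrs, lam=1e-3):
    # HKNN-style distance from query q to the affine hull of nbrs,
    # with a small ridge penalty lam on the combination coefficients.
    mean = nbrs.mean(axis=0)
    V = (nbrs - mean).T                       # directions spanning the local hyperplane
    A = V.T @ V + lam * np.eye(V.shape[1])
    a = np.linalg.solve(A, V.T @ (q - mean))  # regularized least-squares coordinates
    p = mean + V @ a                          # projection of q onto the hyperplane
    return np.linalg.norm(q - p)

def rlhc_predict(X, y, q, k=3, lam=1e-3):
    # Hypothetical sketch of the RLHC steps described in the abstract:
    # 1) find the k nearest neighbors of q from each class;
    # 2) apply the relative transformation over all these neighbors;
    # 3) build each local hyperplane in the relative space and classify
    #    q to the class whose hyperplane is closest.
    nbrs_per_class = {}
    for c in np.unique(y):
        Xc = X[y == c]
        idx = np.argsort(np.linalg.norm(Xc - q, axis=1))[:k]
        nbrs_per_class[c] = Xc[idx]
    anchors = np.vstack(list(nbrs_per_class.values()))  # anchor set (assumed choice)
    q_rel = relative_transform(anchors, q[None, :])[0]
    dists = {c: hyperplane_distance(q_rel, relative_transform(anchors, nbrs), lam)
             for c, nbrs in nbrs_per_class.items()}
    return min(dists, key=dists.get)
```

In this sketch the ridge term `lam` plays the role of the usual HKNN regularizer that keeps the local hyperplane from overfitting sparse neighborhoods; the relative space simply re-expresses the query and its per-class neighbors before the hyperplane distances are compared.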