2D spiral pattern recognition with possibilistic measures
Pattern Recognition Letters
A Bootstrap Technique for Nearest Neighbor Classifier Design
IEEE Transactions on Pattern Analysis and Machine Intelligence
Automatic Textual Document Categorization Based on Generalized Instance Sets and a Metamodel
IEEE Transactions on Pattern Analysis and Machine Intelligence
Adaptive Quasiconformal Kernel Nearest Neighbor Classification
IEEE Transactions on Pattern Analysis and Machine Intelligence
Nearest Neighbors by Neighborhood Counting
IEEE Transactions on Pattern Analysis and Machine Intelligence
A local mean-based nonparametric classifier
Pattern Recognition Letters
Local relative transformation with application to isometric embedding
Pattern Recognition Letters
Tailored Aggregation for Classification
IEEE Transactions on Pattern Analysis and Machine Intelligence
The Nearest Neighbor Algorithm of Local Probability Centers
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
The k-nearest-neighbors classifier is simple and often performs well. However, it does not work well on noisy, high-dimensional data, because the structure formed by the selected nearest neighbors is easily deformed and perceptually unstable. This paper presents an approach that locally centralizes samples, using kernel techniques, to preprocess the data: for each original sample it creates a new sample from that sample's neighborhood and substitutes it as the candidate for nearest-neighbor selection. The approach can be justified by Gestalt psychology and provides better-quality data to classifiers even when the original data are noisy and high dimensional. Experiments on challenging benchmark data sets validate the proposed approach.
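The centralization idea described in the abstract can be sketched in a few lines: replace each sample by the mean of its neighborhood before running kNN. This is a minimal illustration only; it uses a plain Euclidean local mean rather than the paper's kernel techniques, and the neighborhood size `k` and helper names are assumptions, not the authors' implementation.

```python
import numpy as np

def centralize(X, k=5):
    """Replace each sample by the mean of its k nearest neighbors
    (self included). A sketch of local centralization: averaging over
    the neighborhood smooths noise so the kNN structure is more stable.
    Plain Euclidean distances are used here instead of a kernel."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    idx = np.argsort(d2, axis=1)[:, :k]                  # indices of k nearest neighbors
    return X[idx].mean(axis=1)                           # local mean per sample

def knn_predict(X_train, y_train, X_test, k=3):
    """Standard majority-vote kNN on the (centralized) training samples."""
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d2, axis=1)[:, :k]
    return np.array([np.bincount(y_train[v]).argmax() for v in idx])
```

A typical use would centralize the training set once, then classify test points against the centralized samples; on noisy clusters the averaged samples sit closer to the cluster cores than the raw ones.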