To optimize the accuracy of the nearest-neighbor classification rule, a weighted distance is proposed, along with algorithms that automatically learn the corresponding weights. These weights may be specific to each class and feature, to each individual prototype, or to both. The learning algorithms are derived by (approximately) minimizing the leave-one-out classification error on the given training set. The proposed approach is assessed through a series of experiments on UCI/StatLog corpora, as well as on a text classification task that entails very sparse, very high-dimensional data representations. In all these experiments the proposed approach behaves uniformly well, with results comparable to or better than the state-of-the-art results previously published for the same data.
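The two ingredients described above — a class- and feature-dependent weighted distance, and the leave-one-out error it is tuned against — can be sketched as follows. This is a minimal illustration, not the paper's actual learning algorithm: the weight matrix `W` (one weight per class and feature) and the toy data are assumptions for the example, and the weights here are set by hand rather than learned by the gradient-based minimization the abstract refers to.

```python
import numpy as np

def weighted_nn_predict(X_train, y_train, x, W):
    """Classify x with a 1-NN rule under a class/feature-weighted distance.

    W[c, j] is the weight of feature j for prototypes of class c (an
    illustrative parameterization; the paper also allows per-prototype
    weights). The distance from x to a prototype p of class c is
    sqrt(sum_j (W[c, j] * (x[j] - p[j]))**2).
    """
    classes = y_train.astype(int)
    diffs = X_train - x                 # (n, d) differences to every prototype
    scaled = diffs * W[classes]         # weight each row by its prototype's class
    dists = np.sqrt((scaled ** 2).sum(axis=1))
    return y_train[np.argmin(dists)]

def loo_error(X, y, W):
    """Leave-one-out 1-NN error rate under the weighted distance:
    each sample is classified using all remaining samples as prototypes."""
    n = len(X)
    errs = 0
    for i in range(n):
        mask = np.arange(n) != i
        errs += weighted_nn_predict(X[mask], y[mask], X[i], W) != y[i]
    return errs / n

# Toy data (hypothetical): feature 0 separates the classes, feature 1 is
# large-scale noise that dominates the plain Euclidean distance.
X = np.array([[0.0, 0.0], [0.1, 9.0], [0.2, 4.0],
              [1.0, 8.5], [0.9, 0.5], [1.1, 4.5]])
y = np.array([0, 0, 0, 1, 1, 1])

W_uniform = np.ones((2, 2))                      # plain Euclidean distance
W_good = np.array([[1.0, 0.01], [1.0, 0.01]])    # downweights the noisy feature
```

Evaluating `loo_error(X, y, W_uniform)` versus `loo_error(X, y, W_good)` on this toy set shows the effect the learning algorithms exploit: shrinking the weight of the noisy feature drives the leave-one-out error down, which is precisely the criterion the weights are optimized against.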