Associating kNN and SVM for higher classification accuracy
CIS'05 Proceedings of the 2005 international conference on Computational Intelligence and Security - Volume Part I
The paper proposes a model that merges a non-parametric k-nearest-neighbor (kNN) method into an underlying support vector machine (SVM) to produce an instance-dependent loss function. A kNN-based filtering stage collects information from the training examples and produces a set of emphasis weights, which are distributed to every example through a class of real-valued class labels. These weights replace the usual equal-weight treatment of training examples and allow the information carried by examples of differing significance to be exploited more efficiently. Because kNN estimates density locally, it can distinguish heterogeneous examples from regular ones by examining only each example's own neighborhood. Both theoretical derivations and the accompanying experimental results show that the model is promising.
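The kNN filtering stage described above can be illustrated with a minimal sketch. The function below is an assumption-laden reconstruction, not the paper's exact formulation: it weights each training example by the fraction of its k nearest neighbors that share its label, so examples in class-homogeneous regions receive weight near 1 while heterogeneous examples (e.g. points deep in the other class's region) receive lower weight. The resulting weights could then scale the +/-1 labels into real-valued class labels, or serve as per-example weights for the SVM loss.

```python
import math

def knn_emphasis_weights(X, y, k=3):
    """Illustrative kNN filtering stage: for each training example,
    return the fraction of its k nearest neighbors (Euclidean
    distance, self excluded) that share its label. High weight marks
    a 'regular' example; low weight marks a 'heterogeneous' one."""
    n = len(X)
    weights = []
    for i in range(n):
        # Sort all other examples by distance to example i.
        dists = sorted(
            (math.dist(X[i], X[j]), j) for j in range(n) if j != i
        )
        # Count label agreement among the k closest neighbors.
        agree = sum(1 for _, j in dists[:k] if y[j] == y[i])
        weights.append(agree / k)
    return weights

# Toy data: two clusters plus two points near the opposite cluster.
X = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (2.0, 2.0), (2.1, 2.0), (1.0, 1.0)]
y = [1, 1, 1, -1, -1, 1]
w = knn_emphasis_weights(X, y, k=3)
real_valued_labels = [wi * yi for wi, yi in zip(w, y)]
```

Here the two -1 examples sit near a stray +1 point, so their weights fall below 1, shrinking their real-valued labels and hence their influence on the downstream SVM, consistent with the idea of down-weighting heterogeneous examples.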