Consider the classification task of assigning a test instance to one of two or more possible classes. An intuitive approach is to assign the instance to the class to which its distance is minimal. If the distance to the convex hull of a class is taken as the distance measure, the resulting method is the Nearest Convex Hull (NCH) classifier. Two key issues severely restrict the applicability of this method in its basic form, and we solve both in this paper: first, how to handle class overlap, and second, how to provide (nonlinear) solutions with better generalization ability. The first problem is handled using so-called kernel functions and slack variables. The second is addressed with a penalization term that suppresses overly complex solutions. We call the resulting method the soft-NCH classifier. In spirit and computationally, the method is close to the popular Support Vector Machine (SVM) classifier and can be viewed as an instance-based large-margin classification technique. Advantages of the soft-NCH classifier include robustness to outliers, good generalization ability, and naturally easy handling of multi-class problems. We compare the performance of soft-NCH against state-of-the-art techniques and report promising results.
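To make the basic (hard) NCH decision rule concrete, the following is a minimal sketch in the linear Euclidean setting, without the paper's kernels, slack variables, or penalization term. The distance from a test point to a class's convex hull is found by optimizing the convex-combination weights with SciPy's SLSQP solver; the function names here are illustrative, not from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def dist_to_convex_hull(x, points):
    """Squared Euclidean distance from x to the convex hull of `points`
    (one training instance per row), i.e. min over w >= 0, sum(w) = 1
    of ||x - w @ points||^2."""
    n = len(points)

    def objective(w):
        diff = x - w @ points  # x minus the convex combination
        return diff @ diff

    constraints = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * n
    w0 = np.full(n, 1.0 / n)  # start from the centroid
    res = minimize(objective, w0, method='SLSQP',
                   bounds=bounds, constraints=constraints)
    return res.fun

def nch_predict(x, class_points):
    """Hard NCH rule: assign x to the class whose convex hull is nearest."""
    dists = [dist_to_convex_hull(x, pts) for pts in class_points]
    return int(np.argmin(dists))
```

For overlapping classes this hard rule breaks down (a point inside both hulls has zero distance to each), which is exactly what the soft-NCH formulation with slack variables is designed to handle.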