Fast minimization of structural risk by nearest neighbor rule
IEEE Transactions on Neural Networks
Support vector machines (SVMs) are among the most sophisticated and powerful classifiers available today. However, their robustness and novel approach come at a large computational cost. Nearest neighbor (NN) classifiers, on the other hand, offer a simple yet robust approach that is guaranteed to converge. In this paper, we present a technique that combines these two classifiers by adopting an NN rule-based structural risk minimization (NNSRM) classifier. Using synthetic and real data, the classification technique is shown to be more robust to kernel conditions, and to have a significantly lower computational cost, than conventional SVMs. Consequently, the proposed method provides a powerful alternative to SVMs in applications where computation time and accuracy are of prime importance. Experimental results indicate that the NNSRM formulation is not only computationally less expensive, but also much more robust to varying data representations than SVMs.
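To make the idea concrete, a minimal sketch of an NNSRM-style prototype selection is given below. This is an illustrative reconstruction, not the authors' implementation: it assumes the classifier greedily adds the closest opposite-class pairs of training points (the NN analogue of margin-critical support vectors) until the 1-NN rule over the selected reference set classifies the whole training set correctly. The function name `nnsrm_select` and all parameters are hypothetical.

```python
import numpy as np

def nnsrm_select(X, y):
    """Greedy NNSRM-style reference-set selection (illustrative sketch).

    X : (n, d) array of training points
    y : length-n sequence of class labels
    Returns the indices of the selected reference points.
    """
    n = len(y)
    # All opposite-class pairs, sorted by Euclidean distance (closest first).
    pairs = sorted(
        (np.linalg.norm(X[i] - X[j]), i, j)
        for i in range(n) for j in range(i + 1, n)
        if y[i] != y[j]
    )
    selected = []
    for _, i, j in pairs:
        for k in (i, j):
            if k not in selected:
                selected.append(k)
        # Stop as soon as 1-NN over the selected set has zero training error.
        ref = np.array(selected)
        if all(
            y[ref[np.argmin([np.linalg.norm(X[m] - X[r]) for r in ref])]] == y[m]
            for m in range(n)
        ):
            break
    return selected
```

Because only the pairs nearest to the class boundary enter the reference set, the resulting classifier typically keeps far fewer points than the full training set, which is the source of the computational savings claimed over SVM training.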