Structural risk minimization using nearest neighbor rule
ICME '03 Proceedings of the 2003 International Conference on Multimedia and Expo - Volume 2
In this paper, we present a novel nearest neighbor rule-based implementation of the structural risk minimization principle for a generic classification problem. We propose a fast reference set thinning algorithm on the training data set, similar in spirit to the support vector machine (SVM) approach. We then show that the nearest neighbor rule based on the reduced set implements the structural risk minimization principle in a manner that does not require selecting a convenient feature space. Simulation results on real data indicate that this method significantly reduces the computational cost of conventional SVMs while achieving nearly comparable test error performance.
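The abstract does not spell out the thinning algorithm itself, but the general idea of condensing a training set into a reduced reference set for the 1-NN rule can be illustrated with Hart-style condensed nearest neighbor thinning. This is a minimal sketch, not the paper's exact method: it repeatedly absorbs into the reference set any training point that the current reference set misclassifies, until the reduced set classifies the whole training set correctly.

```python
import math
import random

def condense(points, labels, seed=0):
    """Hart-style condensed nearest neighbor thinning (illustrative sketch).

    Returns indices of a reduced reference set S such that every training
    point is correctly classified by the 1-NN rule over S.
    """
    rnd = random.Random(seed)
    order = list(range(len(points)))
    rnd.shuffle(order)
    keep = [order[0]]           # seed the reference set with one sample
    changed = True
    while changed:              # repeat passes until no point is absorbed
        changed = False
        for i in order:
            if i in keep:
                continue
            # 1-NN prediction over the current reference set
            j = min(keep, key=lambda k: math.dist(points[k], points[i]))
            if labels[j] != labels[i]:
                keep.append(i)  # absorb the misclassified point
                changed = True
    return sorted(keep)
```

On well-separated data the retained points tend to concentrate near the class boundary, which is the intuition behind the SVM analogy drawn in the abstract: like support vectors, the reduced reference set carries the information needed to reproduce the decision boundary.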