A training algorithm for optimal margin classifiers. COLT '92: Proceedings of the Fifth Annual Workshop on Computational Learning Theory.
Combining support vector and mathematical programming methods for classification. In Advances in Kernel Methods.
A Tutorial on Support Vector Machines for Pattern Recognition. Data Mining and Knowledge Discovery.
Text Categorization with Support Vector Machines: Learning with Many Relevant Features. ECML '98: Proceedings of the 10th European Conference on Machine Learning.
An overview of statistical learning theory. IEEE Transactions on Neural Networks.
A support vector machine (SVM) constructs an optimal classification hyperplane from support vectors. When samples near the decision boundary overlap heavily, this not only increases the computational burden but also degrades generalization ability. The NN-SVM algorithm, an improved SVM, was proposed in [1] to address these problems: NN-SVM simply retains or deletes a sample according to whether its nearest neighbor has the same class label as the sample itself. However, its generalization ability is still degraded by samples intermixed with the other class. In this paper, we therefore present an improved NN-SVM algorithm that prunes a sample according to both its nearest neighbor's class label and the distances between the sample and its k same-class nearest neighbors. Experimental results show that the improved NN-SVM achieves better classification accuracy than NN-SVM, while its total training and testing time is comparable to that of NN-SVM.
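The abstract specifies the pruning rule only at a high level. The sketch below (Python with NumPy) shows one plausible reading: the original NN-SVM test, which keeps a sample only if its nearest neighbor shares its label, combined with an assumed distance test on the k same-class nearest neighbors. The function name prune_nn_svm_improved, the ratio threshold, and the class-average criterion are illustrative assumptions, not the paper's exact rule.

```python
import numpy as np

def prune_nn_svm_improved(X, y, k=3, ratio=1.5):
    """Sketch of the pruning step described in the abstract.

    Keeps sample i only if (a) its nearest neighbor has the same
    class label (the NN-SVM rule) and (b) its mean distance to its
    k nearest same-class neighbors is not unusually large (assumed
    reading of the improved rule). Assumes each class has >= 2 samples.
    Returns the indices of the retained samples.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n = len(X)

    # Pairwise Euclidean distances; self-distances masked out.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)

    # Mean distance from each sample to its k nearest same-class neighbors.
    knn_mean = np.empty(n)
    for i in range(n):
        same = np.flatnonzero((y == y[i]) & (np.arange(n) != i))
        knn_mean[i] = np.sort(d[i, same])[:k].mean()

    keep = []
    for i in range(n):
        nn_same_label = y[np.argmin(d[i])] == y[i]  # original NN-SVM rule
        class_avg = knn_mean[y == y[i]].mean()
        # Assumed criterion: drop samples that sit unusually far from
        # their own class (likely intermixed with the other class).
        if nn_same_label and knn_mean[i] <= ratio * class_avg:
            keep.append(i)
    return np.array(keep, dtype=int)
```

In use, the retained subset X[idx], y[idx] (with idx = prune_nn_svm_improved(X, y)) would be passed to any standard SVM trainer; the pruning only shrinks the training set, so the SVM itself is unchanged.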