The nature of statistical learning theory
An introduction to support vector machines and other kernel-based learning methods
Support vector machine active learning for image retrieval
MULTIMEDIA '01 Proceedings of the ninth ACM international conference on Multimedia
Learning large margin classifiers locally and globally
ICML '04 Proceedings of the twenty-first international conference on Machine learning
Structured large margin machines: sensitive to data distributions
Machine Learning
LIBSVM: A library for support vector machines
ACM Transactions on Intelligent Systems and Technology (TIST)
The generalization error of the symmetric and scaled support vector machines
IEEE Transactions on Neural Networks
Support Vector Machines (SVMs) are efficient tools that have been widely studied and applied in many fields. However, the original SVM (C-SVM) focuses only on the scatter between classes and neglects global information about the data, which is also vital for an optimal classifier; as a result, C-SVM loses some robustness. One way to address this problem is to translate the hyperplane (i.e., to move it without rotation or change of shape) according to the global characteristics of the data. However, some existing methods that follow this approach rely on a specific distribution assumption (S-SVM), while others fail to exploit the global information (GS-SVM). In this paper, we propose a simple but efficient method based on the weighted scatter degree (WSD-SVM) that embeds the global information into GS-SVM without any distribution assumption. A comparison of WSD-SVM, C-SVM, and GS-SVM on several data sets demonstrates the advantages of WSD-SVM.
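The idea of translating a hyperplane according to global data characteristics can be illustrated with a minimal numpy sketch. This is not the paper's WSD-SVM algorithm: the "scatter" measure here is a hypothetical stand-in (the per-class standard deviation of the points projected onto the normal vector), and the initial hyperplane is a simple midpoint-between-means plane rather than a trained C-SVM. The sketch only shows why shifting the boundary toward the tighter class can improve robustness when the two classes have very different spreads.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two synthetic classes with the same separation but very different spreads.
X_pos = rng.normal(loc=+2.0, scale=2.0, size=(200, 2))   # broad class
X_neg = rng.normal(loc=-2.0, scale=0.5, size=(200, 2))   # tight class

# A margin-style hyperplane w.x + b = 0: normal from the class means,
# offset placing the plane midway between them (ignores class spreads,
# as a scatter-blind classifier would on symmetric data).
w = X_pos.mean(axis=0) - X_neg.mean(axis=0)
w = w / np.linalg.norm(w)
b = -0.5 * (X_pos.mean(axis=0) + X_neg.mean(axis=0)) @ w

# Stand-in "scatter degree": projected standard deviation of each class
# along w. A broader class needs more room on its side of the plane.
s_pos = np.std(X_pos @ w)
s_neg = np.std(X_neg @ w)

# Translate the plane (change b only, so w is unchanged: a pure shift,
# no rotation) toward the tighter class. The step size 0.5 is arbitrary.
b_shifted = b + 0.5 * (s_pos - s_neg)

def errors(b_):
    """Count misclassified points for the plane w.x + b_ = 0."""
    return int((X_pos @ w + b_ <= 0).sum() + (X_neg @ w + b_ > 0).sum())

print(errors(b), errors(b_shifted))  # shifted plane misclassifies fewer points
```

On this synthetic data the translated plane cuts the error count roughly in half, because the untranslated midpoint plane sits too close to the broad class. The actual WSD-SVM weighting scheme, per the abstract, achieves this without assuming any particular distribution.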