Proximal support vector machine via generalized eigenvalues (GEPSVM), a variant of SVM, was originally motivated by the need to effectively classify XOR problems, which are not linearly separable. Analysis and experiments have shown it to outperform SVM in terms of reduced time complexity. However, GEPSVM has two major disadvantages: (1) some complex XOR problems cannot be classified effectively; (2) it may fail to obtain a stable solution because of matrix singularity. By defining a new principle, we propose a novel algorithm, called multi-weight vector support vector machines (MVSVM). The proposed method not only retains the superior characteristics of GEPSVM but also offers additional advantages: (1) it performs well on complex XOR datasets; (2) instead of the generalized eigenvalue problems of GEPSVM, MVSVM solves two standard eigenvalue problems, avoiding GEPSVM's matrix singularity; (3) its generalization ability is comparable to or better than that of SVM and GEPSVM; (4) it is the fastest of the three algorithms. Experiments on artificial and public datasets also demonstrate the effectiveness of MVSVM.
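To make the singularity issue concrete, the following is a minimal sketch of the two eigenproblem types contrasted above. The toy class matrices, the GEPSVM-style objective matrices `G` and `H`, and the variable names are all illustrative assumptions; the abstract does not give the exact MVSVM formulation, only that it replaces a generalized eigenvalue problem with standard ones.

```python
import numpy as np
from scipy.linalg import eig, eigh

# Toy two-class data (rows = samples); purely illustrative.
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 2))          # class +1 samples
B = rng.normal(size=(20, 2)) + 3.0    # class -1 samples

# Augment with a bias column, as in GEPSVM-style plane formulations.
e = np.ones((20, 1))
G = np.hstack([A, e]).T @ np.hstack([A, e])   # closeness-to-A matrix (symmetric PSD)
H = np.hstack([B, e]).T @ np.hstack([B, e])   # distance-from-B matrix (symmetric PSD)

# GEPSVM-style step: a *generalized* eigenvalue problem G z = lambda H z.
# If H is singular (or nearly so), this solution can be unstable.
gen_vals, gen_vecs = eig(G, H)

# A *standard* symmetric eigenvalue problem, of the kind MVSVM is said to
# solve twice, needs no matrix inverse and is always well defined:
std_vals, std_vecs = eigh(G)          # eigenvalues returned in ascending order
z = std_vecs[:, 0]                    # eigenvector of the smallest eigenvalue
```

The point of the contrast is that `eigh(G)` depends only on `G` being symmetric, whereas `eig(G, H)` implicitly requires `H` to be well conditioned, which is exactly the failure mode the abstract attributes to GEPSVM.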