Proximal support vector machine classifiers. Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.
Multisurface Proximal Support Vector Machine Classification via Generalized Eigenvalues. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Fuzzy multi-category proximal support vector classification via generalized eigenvalues. Soft Computing - A Fusion of Foundations, Methodologies and Applications.
Twin Support Vector Machines for Pattern Classification. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Regularized locality preserving indexing via spectral regression. Proceedings of the Sixteenth ACM Conference on Information and Knowledge Management.
A classification method based on generalized eigenvalue problems. Optimization Methods & Software.
Nonparallel plane proximal classifier. Signal Processing.
Least squares twin support vector machines for pattern classification. Expert Systems with Applications.
A least squares formulation for a class of generalized eigenvalue problems in machine learning. Proceedings of the 26th Annual International Conference on Machine Learning (ICML '09).
Locality sensitive discriminant analysis. Proceedings of the 20th International Joint Conference on Artificial Intelligence (IJCAI '07).
Proximal support vector machine using local information. Neurocomputing.
Fuzzy proximal support vector classification via generalized eigenvalues. Proceedings of the First International Conference on Pattern Recognition and Machine Intelligence (PReMI '05).
Probabilistic outputs for twin support vector machines. Knowledge-Based Systems.
Multisurface proximal support vector machine via generalized eigenvalues (GEPSVM), an effective classification tool for supervised learning, seeks two nonparallel planes determined by solving two generalized eigenvalue problems (GEPs). The GEPs may lead to unstable classification performance due to matrix singularity. Proximal support vector machine using local information (LIPSVM), a variant of GEPSVM, avoids this shortcoming by adopting a formulation similar to the Maximum Margin Criterion (MMC); its solution follows directly from solving two standard eigenvalue problems. LIPSVM can be viewed as a reduced algorithm, since it trains the classifier on a selectively generated subset of points, and a major advantage is its resistance to outliers. In this paper, following the geometric intuition of LIPSVM, a novel multi-plane learning approach called Localized Twin SVM via Convex Minimization (LCTSVM) is proposed. The approach determines two nonparallel planes by solving two newly formed SVM-type problems. Besides keeping the superior characteristics of LIPSVM, LCTSVM has further advantages: (1) it achieves classification accuracy similar to or better than LIPSVM, TWSVM and LSTSVM; (2) each plane is obtained from a quadratic programming problem (QPP) rather than the special convex-difference optimization arising in LIPSVM; (3) the solution can be reduced to solving two systems of linear equations, at considerably lower computational cost; and (4) it finds the global minimum. Experiments on both toy and real-world problems demonstrate the effectiveness of LCTSVM.
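The abstract notes that the two nonparallel planes can be obtained by solving two systems of linear equations rather than eigenvalue problems. A minimal sketch of that idea in the least-squares twin-SVM (LSTSVM) style referenced above, not the paper's exact LCTSVM formulation: the function name, regularization parameters, and toy data below are illustrative assumptions.

```python
import numpy as np

def twin_planes(A, B, c1=1.0, c2=1.0, reg=1e-6):
    """Fit two nonparallel planes (w1, b1) and (w2, b2) in the
    least-squares twin-SVM style: each plane comes from a single
    linear system rather than a full QPP solve."""
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    H = np.hstack([A, e1])        # class-1 points with bias column appended
    G = np.hstack([B, e2])        # class-2 points with bias column appended
    I = reg * np.eye(H.shape[1])  # small ridge term guards against singularity
    # Plane 1: stay close to class 1 while keeping class 2 at unit distance
    z1 = np.linalg.solve(H.T @ H / c1 + G.T @ G + I, -G.T @ e2)
    # Plane 2: stay close to class 2 while keeping class 1 at unit distance
    z2 = np.linalg.solve(G.T @ G / c2 + H.T @ H + I, H.T @ e1)
    return (z1[:-1, 0], float(z1[-1, 0])), (z2[:-1, 0], float(z2[-1, 0]))

def predict(x, plane1, plane2):
    """Assign x to the class whose plane it lies nearer to."""
    (w1, b1), (w2, b2) = plane1, plane2
    d1 = abs(w1 @ x + b1) / np.linalg.norm(w1)
    d2 = abs(w2 @ x + b2) / np.linalg.norm(w2)
    return 1 if d1 <= d2 else 2

# Illustrative toy data: two elongated clouds along different directions
rng = np.random.default_rng(0)
A = rng.normal(0.0, 0.1, (50, 2)) + rng.normal(0, 1, (50, 1)) * np.array([1.0, 0.2])
B = rng.normal(0.0, 0.1, (50, 2)) + np.array([0.0, 2.0]) + \
    rng.normal(0, 1, (50, 1)) * np.array([1.0, -0.2])
p1, p2 = twin_planes(A, B)
acc = np.mean([predict(x, p1, p2) == 1 for x in A] +
              [predict(x, p1, p2) == 2 for x in B])
```

Each plane solves one (d+1)-by-(d+1) linear system, which is where the cost advantage over a generalized eigenvalue decomposition comes from; a test point is then labeled by whichever plane is nearer in perpendicular distance.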