References
- Multicategory Classification by Support Vector Machines. Computational Optimization and Applications, special issue on computational optimization: a tribute to Olvi Mangasarian, part I.
- Benchmarking Least Squares Support Vector Machine Classifiers. Machine Learning.
- A comparison of methods for multiclass support vector machines. IEEE Transactions on Neural Networks.
This paper presents least squares formulations for constructing pairwise linear and nonlinear classification decision functions. The formulations are based on the KKT system obtained from the optimality conditions of the PAMSVM problem. This derivation can be considered a variant of Suykens and Vandewalle's least squares multi-class SVMs, the notable differences being the equality constraints used in their problem formulation and the encoded classes (labels) represented by multiple output streams of the system. The least squares formulations are of two types, namely the pairwise least squares multi-classification support vector machine (PALS-MSVM) and the piecewise least squares multi-classification support vector machine (PILS-MSVM). A piecewise MSVM formulation is selected from the existing literature, and its optimality conditions are written out and expressed as a least squares problem. The structures of both LS problems are essentially the same, the only difference being the matrix that carries the dataset information. Since the Mercer conditions apply, kernels can be used as appropriate for nonlinear classification problems.
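To make the least squares construction concrete, the following is a minimal sketch, not the paper's own method: a binary LS-SVM in the style of Suykens and Vandewalle, trained by solving its KKT linear system, and combined pairwise (one-vs-one) over class pairs. The function names and the hyperparameters `gamma` (regularization) and `sigma` (RBF kernel width) are illustrative assumptions.

```python
# Hedged sketch of a pairwise least squares SVM classifier.
# Assumptions: RBF (Mercer) kernel; hyperparameters gamma and sigma
# are illustrative, not values from the paper.
import numpy as np
from itertools import combinations

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_lssvm(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM KKT linear system for labels y in {-1, +1}.

    The system is
        [ 0   y^T            ] [ b     ]   [ 0 ]
        [ y   Omega + I/gamma] [ alpha ] = [ 1 ]
    with Omega_ij = y_i y_j K(x_i, x_j).
    """
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def lssvm_decision(X, y, alpha, b, Xtest, sigma=1.0):
    """Evaluate f(x) = sum_i alpha_i y_i K(x, x_i) + b on test points."""
    return rbf_kernel(Xtest, X, sigma) @ (alpha * y) + b

def predict_pairwise(Xtr, ytr, Xte, gamma=10.0, sigma=1.0):
    """One-vs-one voting: train one LS-SVM per class pair, tally votes."""
    classes = np.unique(ytr)
    votes = np.zeros((len(Xte), len(classes)), dtype=int)
    for i, j in combinations(range(len(classes)), 2):
        mask = (ytr == classes[i]) | (ytr == classes[j])
        Xij = Xtr[mask]
        yij = np.where(ytr[mask] == classes[i], 1.0, -1.0)
        alpha, b = train_lssvm(Xij, yij, gamma, sigma)
        f = lssvm_decision(Xij, yij, alpha, b, Xte, sigma)
        votes[f >= 0, i] += 1
        votes[f < 0, j] += 1
    return classes[votes.argmax(axis=1)]
```

Note that, as the abstract observes for the PALS/PILS pair, every pairwise subproblem shares the same KKT structure; only the kernel matrix built from the selected subset of the data changes.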