In this paper, we discuss subspace-based least squares support vector machines (SSLS-SVMs), in which an input vector is classified into the class with the maximum similarity. Namely, we define the similarity measure for each class by a weighted sum of vectors called dictionaries, and we optimize the weights so that the margin between classes is maximized. Because a similarity measure is defined for each class, the similarity measure associated with a data sample needs to be the largest among all the similarity measures. Introducing slack variables, we express these constraints as equality constraints. The resulting SSLS-SVMs are then similar to LS-SVMs in the all-at-once formulation. Because the all-at-once formulation is inefficient, we also propose SSLS-SVMs in the one-against-all formulation. We demonstrate the effectiveness of the proposed methods by comparison with the conventional method on two-class problems.
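The decision rule described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the RBF kernel, the uniform weights in the usage example, and the function names are all assumptions; in the actual method the weights would be obtained by the LS-SVM-style margin optimization.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # Gaussian RBF kernel (an assumed choice; the abstract does not fix a kernel).
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2))

def similarity(x, dictionary, weights, gamma=1.0):
    # Similarity of x to one class: weighted sum of kernel evaluations with
    # that class's dictionary vectors. In the paper the weights come from
    # margin optimization; here they are simply given.
    return sum(w * rbf_kernel(x, d, gamma) for w, d in zip(weights, dictionary))

def classify(x, dictionaries, weight_sets, gamma=1.0):
    # Assign x to the class whose similarity measure is largest.
    sims = [similarity(x, D, w, gamma) for D, w in zip(dictionaries, weight_sets)]
    return int(np.argmax(sims))
```

For example, with two classes whose dictionary vectors cluster near 0 and 5 respectively (and uniform weights, a placeholder), a test point near 0 is assigned to the first class because its weighted kernel similarity to that dictionary dominates.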