In subspace methods, the subspace associated with a class is spanned by a small number of vectors called dictionaries; a similarity measure is defined using these dictionaries, and an input is classified into the class with the highest similarity. Usually, each dictionary is given an equal weight, but if the subspaces of different classes overlap, the similarity measures in the overlapping regions give little useful information for classification. In this paper, we propose optimizing the weights of the dictionaries using the idea of support vector machines (SVMs). Namely, we first map the input space into the empirical feature space, perform kernel principal component analysis (KPCA) for each class, and define a similarity measure. Then, noting that the similarity measure corresponds to a hyperplane, we formulate the optimization problem as maximizing the margin between the class associated with the dictionaries and the remaining classes. This optimization problem reduces to the all-at-once formulation of linear SVMs. We demonstrate the effectiveness of the proposed method by comparing it with conventional methods on two-class problems.
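The pipeline described above can be illustrated with a minimal sketch: per-class KPCA yields the dictionaries, the per-component squared projections serve as similarity features, and a linear SVM trained on those features plays the role of the margin-maximizing weight optimization. This is an illustrative approximation using scikit-learn, not the paper's exact all-at-once formulation; the toy data, kernel parameter `gamma`, and number of components `n_comp` are assumptions for the example.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.svm import LinearSVC

# Toy two-class data (an assumption for illustration only).
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, size=(40, 2))
X1 = rng.normal(2.5, 1.0, size=(40, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 40 + [1] * 40)

# Per-class KPCA: the principal components form each class's dictionary.
n_comp = 3
kpcas = {}
for c in (0, 1):
    kpca = KernelPCA(n_components=n_comp, kernel="rbf", gamma=0.5)
    kpca.fit(X[y == c])
    kpcas[c] = kpca

def component_similarities(x):
    # Squared projections of x onto each dictionary vector of each class,
    # concatenated into one feature vector (one entry per dictionary vector).
    feats = []
    for c in (0, 1):
        z = kpcas[c].transform(x.reshape(1, -1)).ravel()
        feats.append(z ** 2)
    return np.concatenate(feats)

# Baseline (equal dictionary weights): classify by the larger summed similarity.
def classify_equal(x):
    s = component_similarities(x)
    return 0 if s[:n_comp].sum() >= s[n_comp:].sum() else 1

# Learned weights: a linear SVM over the similarity features maximizes the
# margin between the classes, weighting each dictionary vector individually.
F = np.array([component_similarities(x) for x in X])
svm = LinearSVC(C=1.0).fit(F, y)

def classify_weighted(x):
    return int(svm.predict(component_similarities(x).reshape(1, -1))[0])
```

The learned SVM weights down-weight dictionary vectors whose similarity responses overlap between classes, which is precisely the situation where equal weighting is uninformative.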