The common vector (CV) method is a linear subspace classifier that discriminates between classes of data sets, such as those arising in image and word recognition. The method represents each class by a subspace during classification, and each subspace is modeled so that it captures the features common to all samples of the corresponding class. To extract these common features, the method removes the components of each sample that lie along the eigenvectors corresponding to the nonzero eigenvalues of the class covariance matrix. In this paper, we introduce a variation of the CV method, referred to as the modified CV (MCV) method, and we propose a novel approach for applying the MCV method in a nonlinearly mapped, higher-dimensional feature space: all samples are mapped into the feature space using a kernel mapping function, and the MCV method is then applied in the mapped space. Under certain conditions, each class gives rise to a unique CV, and the method guarantees a 100% recognition rate on the training set. Experiments on several test cases also show that the generalization performance of the proposed kernel method is comparable to that of other linear subspace classifiers as well as the kernel-based nonlinear subspace method. Although neither the MCV method nor its kernel counterpart outperformed the support vector machine (SVM) classifier in most of the reported experiments, the proposed methods are simpler to apply than the multiclass SVM classifier and require no parameter tuning.
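The common-vector construction described above can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the paper: `common_vector` is a hypothetical helper that removes, from one training sample, the components lying along the eigenvectors of the class covariance matrix with nonzero eigenvalues (the in-class "difference" directions), leaving the vector shared by every sample of the class. It assumes the small-sample case (fewer samples than features), where the class covariance matrix is rank deficient and a nonzero common vector exists.

```python
import numpy as np

def common_vector(X, sample_index=0, tol=1e-10):
    """Common vector of one class (illustrative sketch of the CV idea).

    X : (n_samples, n_features) training samples of a single class,
        assumed to satisfy n_samples < n_features so that the class
        covariance matrix is rank deficient.
    The components of a sample that lie along eigenvectors of the class
    covariance matrix with nonzero eigenvalues are removed; the residual
    is the same no matter which sample is chosen.
    """
    Xc = X - X.mean(axis=0)                     # centered samples
    # Right singular vectors of the centered data with nonzero singular
    # values span the same subspace as the nonzero-eigenvalue eigenvectors
    # of the class covariance (scatter) matrix.
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    B = Vt[s > tol]                             # orthonormal basis of the difference subspace
    x = X[sample_index]
    return x - B.T @ (B @ x)                    # remove the in-class variation
```

Because any two samples of the class differ only by a vector inside the difference subspace, the residual is independent of `sample_index`; this uniqueness is what lets each class be represented by a single common vector at classification time.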