The paper proposes a two-phase algorithm that combines 2DPCA with the Gram-Schmidt orthogonalization procedure to represent face images in a reduced dimension. The method maximizes the total scatter while also minimizing the within-class scatter. In the first phase, the image covariance matrix is computed as in 2DPCA and subjected to eigenvalue-eigenvector decomposition; features are extracted by projecting onto only the d eigenvectors corresponding to the d largest eigenvalues. In the second phase, the algorithm applies the Gram-Schmidt orthogonalization procedure to compute orthonormal bases, and uses these bases to calculate a common feature vector for each subspace of a class. Gathering the d common feature vectors of a class in matrix form yields that class's common feature matrix, which is then used for image recognition. In experiments on the AR face database, the proposed method produced better recognition rates than Eigenfaces, Fisherfaces, and 2DPCA.
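The two phases described above can be sketched in NumPy. Everything in this sketch is illustrative: the array shapes, the choice of the first class sample as the Gram-Schmidt reference, and the nearest-residual classification rule are assumptions, since the abstract does not specify these details.

```python
import numpy as np

rng = np.random.default_rng(0)

def top_d_eigenvectors(images, d):
    """Phase 1 (2DPCA): image covariance G = (1/M) * sum_i (A_i - mean)^T
    (A_i - mean), then keep the d eigenvectors of largest eigenvalue."""
    mean = np.mean(images, axis=0)
    G = sum((A - mean).T @ (A - mean) for A in images) / len(images)
    vals, vecs = np.linalg.eigh(G)                 # eigenvalues ascending
    return vecs[:, np.argsort(vals)[::-1][:d]]     # n x d projection basis

def gram_schmidt(vectors, tol=1e-10):
    """Orthonormal basis of span(vectors) via classical Gram-Schmidt."""
    basis = []
    for v in vectors:
        w = v - sum(((v @ b) * b for b in basis), np.zeros_like(v))
        norm = np.linalg.norm(w)
        if norm > tol:
            basis.append(w / norm)
    return basis

def class_model(class_images, X):
    """Phase 2: project each class image with X; per feature column,
    orthonormalize the within-class difference vectors and subtract
    their span from a reference sample.  The residual is the class's
    common feature vector for that column; the d columns stacked
    together form the common feature matrix."""
    feats = [A @ X for A in class_images]          # each m x d
    bases, commons = [], []
    for j in range(X.shape[1]):
        ref = feats[0][:, j]
        B = gram_schmidt([f[:, j] - ref for f in feats[1:]])
        commons.append(ref - sum(((ref @ b) * b for b in B),
                                 np.zeros_like(ref)))
        bases.append(B)
    return bases, np.column_stack(commons)         # bases + m x d matrix

def classify(image, X, models):
    """Hypothetical recognition rule: strip each class's difference
    subspace from the test features, then pick the class whose common
    feature matrix is nearest in Frobenius norm."""
    F = image @ X
    best, best_dist = None, np.inf
    for label, (bases, common) in models.items():
        resid = np.column_stack([
            F[:, j] - sum(((F[:, j] @ b) * b for b in B),
                          np.zeros_like(F[:, j]))
            for j, B in enumerate(bases)])
        dist = np.linalg.norm(resid - common)
        if dist < best_dist:
            best, best_dist = label, dist
    return best

# Toy demo: two classes of four random 8x6 "images" each.
classes = {c: [rng.normal(size=(8, 6)) + c for _ in range(4)]
           for c in (0, 1)}
X = top_d_eigenvectors([A for imgs in classes.values() for A in imgs], d=3)
models = {c: class_model(imgs, X) for c, imgs in classes.items()}
```

A useful property of the common-vector construction is that projecting any training sample of a class onto the orthogonal complement of that class's difference subspace yields the same common vector, so training samples sit at (near-)zero distance from their own class model.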