Linear Discriminant Analysis (LDA) is an important and well-developed technique in image recognition, and many linear discrimination methods have been proposed to date. Despite these efforts, traditional LDA retains some weaknesses. In this paper, we propose a new LDA-based method, Block LDA (BLDA), that can outperform traditional LDA. In contrast to conventional LDA, BLDA operates on 2D matrices rather than 1D vectors. Specifically, we first divide the original image into blocks and represent each block as a row vector; stacking these row vectors yields a new matrix that represents the image. LDA can then be applied directly to these matrices. The between-class and within-class covariance matrices in BLDA are much smaller than their counterparts in conventional LDA. As a result, BLDA has three important advantages over LDA. First, the between-class and within-class covariance matrices can be estimated more accurately. Second, less time is required to compute the corresponding eigenvectors. Finally, the block size can be tuned to obtain the best results. Experimental results show that our method achieves better performance than the compared methods.
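The block-matrix construction and the reduced-size scatter matrices described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`image_to_block_matrix`, `blda_scatter`) are hypothetical, and the scatter computation follows the usual 2D-LDA convention of accumulating d x d products over the block matrices, where d is the number of pixels per block.

```python
import numpy as np

def image_to_block_matrix(img, block_h, block_w):
    """Divide an image into non-overlapping blocks and stack each
    flattened block as a row, giving the BLDA matrix representation.
    Assumes the image dimensions are divisible by the block size."""
    H, W = img.shape
    assert H % block_h == 0 and W % block_w == 0
    rows = []
    for i in range(0, H, block_h):
        for j in range(0, W, block_w):
            rows.append(img[i:i + block_h, j:j + block_w].ravel())
    return np.vstack(rows)  # shape: (num_blocks, block_h * block_w)

def blda_scatter(matrices, labels):
    """Between-class (Sb) and within-class (Sw) scatter computed
    directly on the block matrices. Both are d x d with
    d = block_h * block_w -- much smaller than the scatter matrices
    of vectorised LDA, which are (image pixels) x (image pixels)."""
    X = np.asarray(matrices, dtype=float)   # (n_samples, n_blocks, d)
    labels = np.asarray(labels)
    d = X.shape[2]
    overall_mean = X.mean(axis=0)
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in np.unique(labels):
        Xc = X[labels == c]
        class_mean = Xc.mean(axis=0)
        diff = class_mean - overall_mean
        Sb += len(Xc) * diff.T @ diff       # between-class contribution
        for A in Xc:
            Dw = A - class_mean
            Sw += Dw.T @ Dw                 # within-class contribution
    return Sb, Sw
```

The discriminant projection would then be obtained, as in standard LDA, from the leading eigenvectors of `np.linalg.solve(Sw, Sb)` (with regularisation if `Sw` is singular), and images are projected block-matrix-by-block-matrix onto those vectors.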