Principal component analysis (PCA) and Fisher linear discriminant analysis (FLDA), two popular feature extraction approaches in pattern recognition and data analysis, extract features directly from vector patterns; before applying them, any non-vector pattern, such as an image, must first be vectorized by some technique such as concatenation. However, such vectorization has been shown not to be beneficial for image recognition, as demonstrated by both the algebraic feature extraction approach and 2DPCA. In this paper, inspired by these two approaches, we take the opposite direction and extract features from any vector pattern by first matrixizing it into a matrix pattern and then applying matrixized versions of PCA and FLDA, called MatPCA and MatFLDA, to that pattern. MatFLDA rests on essentially the same principle as the algebraic feature extraction approach and is constructed with an objective function similar to FLDA's, while MatPCA, like PCA, obtains a set of projection vectors by minimizing the reconstruction error over the training samples, a derivation somewhat different from that of 2DPCA despite their equivalence. Experiments on 10 publicly available datasets show that MatPCA and MatFLDA improve performance to varying degrees on 7 and 5 of the datasets, respectively, while the computational burden of feature extraction is greatly reduced. It is noteworthy that the proposed approaches remain linear: the improvement in classification accuracy stems not from the commonly used non-linearization of the original linear approaches but from the simple matrixization. Furthermore, another prominent merit of matrixizing FLDA is that it naturally overcomes the notorious rank limitation, namely that the number of discriminating vectors that can be found is bounded by C-1 for a C-class problem, without introducing any additional computational cost.
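The matrixization-then-projection idea can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the function names (`matrixize`, `mat_pca`) and the 3x4 matrix shape are illustrative choices, and the projection vectors are computed, as in 2DPCA, from the eigenvectors of the image-style covariance matrix, which is equivalent to minimizing the reconstruction error over the training samples.

```python
import numpy as np

def matrixize(x, rows, cols):
    """Reshape a 1-D vector pattern into a rows x cols matrix pattern.
    The paper leaves the choice of matrix shape open; plain row-major
    reshaping is one simple option."""
    assert rows * cols == x.size
    return x.reshape(rows, cols)

def mat_pca(patterns, d):
    """MatPCA-style projection: return d projection vectors that minimize
    the reconstruction error of the matrix patterns, i.e. the top-d
    eigenvectors of the covariance-like matrix G (equivalent to 2DPCA)."""
    mean = np.mean(patterns, axis=0)
    G = sum((A - mean).T @ (A - mean) for A in patterns) / len(patterns)
    eigvals, eigvecs = np.linalg.eigh(G)   # ascending eigenvalues
    W = eigvecs[:, ::-1][:, :d]            # keep the top-d eigenvectors
    return W

# Usage: 100 random 12-dimensional vector patterns, matrixized to 3x4.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 12))
mats = np.stack([matrixize(x, 3, 4) for x in X])
W = mat_pca(mats, d=2)
features = mats @ W   # each pattern becomes a 3x2 feature matrix
```

Note how the projection matrix here is only 4x2, whereas vector PCA on the same data would eigendecompose a 12x12 covariance matrix; this shrinkage of the eigenproblem is the source of the reduced computational burden mentioned above.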