The traditional Linear Regression Classification (LRC) method fails when the number of training samples in a class exceeds the dimensionality of the data. In this work, we propose a new implementation of LRC that overcomes this problem, so that LRC remains applicable even when an excessive number of low-dimensional samples is available. To derive the new form of LRC, the relation between the predictor and the correlation matrix of a class is established first. The null space of the correlation matrix is then spanned by the eigenvectors corresponding to its smallest eigenvalues, and these eigenvectors are used to compute the projection matrix of LRC. The theoretical equivalence of LRC and the Class-Featuring Information Compression (CLAFIC) method is also shown. The TI Digit database and the Multiple Features dataset are used to illustrate the proposed improvements to LRC and CLAFIC.
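The construction described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the function names, the eigenvalue tolerance, and the synthetic-data setup are assumptions made here for clarity. Per class, the sketch forms the correlation matrix C = X Xᵀ from the training columns, takes the eigenvectors with (near-)zero eigenvalues as a basis Q of its null space, and classifies a test vector by the smallest null-space projection norm ‖Qᵀy‖, which coincides with the classic LRC residual when it exists. The CLAFIC side of the equivalence is included as well: since ‖y‖² = ‖range projection‖² + ‖null projection‖², minimizing the null-space residual is the same as maximizing the CLAFIC projection onto the class subspace.

```python
import numpy as np

def null_space_lrc_fit(X, tol=1e-10):
    """One class of the null-space LRC variant.

    X : (d, n) matrix whose columns are training samples; n may exceed d.
    Returns an orthonormal basis Q of the null space of C = X X^T,
    i.e. the eigenvectors of C with (near-)zero eigenvalues.
    """
    C = X @ X.T                                  # (d, d) class correlation matrix
    w, V = np.linalg.eigh(C)                     # eigenvalues in ascending order
    return V[:, w <= tol * max(w.max(), 1.0)]    # smallest-eigenvalue eigenvectors

def null_space_lrc_predict(y, null_bases):
    """Assign y to the class with the smallest null-space residual ||Q^T y||."""
    residuals = [np.linalg.norm(Q.T @ y) for Q in null_bases]
    return int(np.argmin(residuals))

def clafic_fit(X, tol=1e-10):
    """CLAFIC counterpart: orthonormal basis of the class subspace,
    i.e. eigenvectors of C = X X^T with nonzero eigenvalues."""
    C = X @ X.T
    w, V = np.linalg.eigh(C)
    return V[:, w > tol * max(w.max(), 1.0)]

def clafic_predict(y, range_bases):
    """Assign y to the class with the largest subspace projection ||U^T y||.
    Equivalent to null_space_lrc_predict, since the two norms sum to ||y||^2."""
    projections = [np.linalg.norm(U.T @ y) for U in range_bases]
    return int(np.argmax(projections))
```

As a sanity check on hypothetical data: with two classes whose samples lie in disjoint coordinate subspaces of R^5 (eight samples each, more than the subspace dimension), both decision rules assign a test vector from the first subspace to class 0 and agree with each other.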