Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection
IEEE Transactions on Pattern Analysis and Machine Intelligence
Where Are Linear Feature Extraction Methods Applicable?
IEEE Transactions on Pattern Analysis and Machine Intelligence
Selecting Principal Components in a Two-Stage LDA Algorithm
CVPR '06 Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Volume 1
Subclass Discriminant Analysis
IEEE Transactions on Pattern Analysis and Machine Intelligence
Journal of Cognitive Neuroscience
Recently, A. M. Martinez argued that the angle between the eigenvector corresponding to the largest eigenvalue of the inter-class covariance and the eigenvector corresponding to the largest eigenvalue of the intra-class covariance is crucial to the performance of traditional linear discriminant methods, and that the results may become unreliable when the two eigenvectors are parallel. After careful scrutiny of this assertion, we conclude that the angle between the two eigenvectors is less decisive for performance; rather, the main drawback of traditional linear methods is that the inter-class covariance cannot precisely reflect the discriminant information. Simply maximizing the inter-class covariance in the principal component space may discard the contribution of adjacent class pairs. We therefore propose the Optimal Linear Discriminant Analysis (OLDA) method, which assigns equal weight to every class pair through a "discriminative power" measure. In addition, a gradient scheme is employed to derive the feature vectors. Finally, to address the multimodal problem, a pre-clustering mechanism is adopted to cope with nonlinear class structure. Experiments on a practical face database and a synthetic database show the promise of the proposed method.
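To make the two quantities discussed above concrete, the short Python/NumPy sketch below builds a class-pair weighted between-class scatter (so that adjacent, hard-to-separate pairs are not drowned out by well-separated ones) and computes the angle between the leading eigenvectors of the inter-class and intra-class scatter matrices, i.e. the angle examined in Martinez's analysis. The function names and the inverse-distance weighting are illustrative assumptions standing in for the paper's "discriminative power"; this is a minimal sketch, not the exact OLDA formulation.

import numpy as np

def pairwise_weighted_scatter(X, y):
    """Class-pair weighted between-class scatter.

    Every class pair contributes the outer product of its mean difference,
    scaled by a weight that decays with the pair's separation, so that
    close (adjacent) class pairs are emphasized rather than lost.
    The inverse-distance weight is an assumption for illustration only.
    """
    classes = np.unique(y)
    d = X.shape[1]
    means = {c: X[y == c].mean(axis=0) for c in classes}
    S_b = np.zeros((d, d))
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            diff = means[ci] - means[cj]
            dist = np.linalg.norm(diff)
            w = 1.0 / (dist ** 2 + 1e-12)   # down-weight well-separated pairs
            S_b += w * np.outer(diff, diff)
    return S_b

def within_class_scatter(X, y):
    """Pooled within-class (intra-class) scatter S_w."""
    d = X.shape[1]
    S_w = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c] - X[y == c].mean(axis=0)
        S_w += Xc.T @ Xc
    return S_w

def leading_eigvec_angle(S_b, S_w):
    """Angle (degrees) between the leading eigenvectors of S_b and S_w."""
    vb = np.linalg.eigh(S_b)[1][:, -1]   # eigenvector of the largest eigenvalue
    vw = np.linalg.eigh(S_w)[1][:, -1]
    cos = abs(vb @ vw) / (np.linalg.norm(vb) * np.linalg.norm(vw))
    return np.degrees(np.arccos(np.clip(cos, 0.0, 1.0)))

For example, calling leading_eigvec_angle(pairwise_weighted_scatter(X, y), within_class_scatter(X, y)) on a labeled data matrix X reports how closely the two leading directions align, while the weighted scatter itself can replace the standard S_b in a Fisher-style criterion.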