Discriminant analysis plays an important role in statistical pattern recognition. A popular method is the Foley-Sammon optimal discriminant vectors (FSODVs) method, which seeks a set of discriminant vectors that maximize the Fisher discriminant criterion under an orthogonality constraint. FSODVs outperforms classic Fisher linear discriminant analysis (FLDA) in the sense that it can provide more discriminant vectors for recognition. The kernel Foley-Sammon optimal discriminant vectors (KFSODVs) method is a nonlinear extension of FSODVs obtained via the kernel trick. However, the existing KFSODVs algorithm can be computationally expensive: it computes a matrix inverse when solving for each discriminant vector, incurring cubic complexity per vector, which is costly when many discriminant vectors must be computed. In this paper, we propose a fast algorithm for solving the KFSODVs based on rank-one updates (ROU) of the eigensystems; it requires only quadratic complexity per discriminant vector. Moreover, we generalize our method to efficiently solve a family of optimally constrained generalized Rayleigh quotient (OCGRQ) problems, which subsumes many existing dimensionality reduction techniques. We conduct extensive experiments on several real data sets to demonstrate the effectiveness of the proposed algorithms.
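To make the optimally constrained Rayleigh-quotient setting concrete, the following is a minimal NumPy sketch of the greedy Foley-Sammon-style extraction the abstract describes: each new vector maximizes the Fisher quotient w'S_b w / w'S_w w subject to orthogonality to the previously extracted vectors, implemented here by restricting the quotient to the orthogonal complement of the current set. This is the straightforward subspace-restriction form (cubic cost per vector), not the paper's rank-one-update algorithm; all function names are illustrative.

```python
import numpy as np

def top_gen_eigvec(A, B):
    """Maximize c'Ac / c'Bc (B symmetric positive definite) via
    Cholesky whitening: B = LL', then a standard symmetric
    eigenproblem for L^{-1} A L^{-T}."""
    L = np.linalg.cholesky(B)
    Linv = np.linalg.inv(L)
    M = Linv @ A @ Linv.T
    _, vecs = np.linalg.eigh(M)          # eigenvalues ascending
    c = Linv.T @ vecs[:, -1]             # map y back to original coords
    return c / np.linalg.norm(c)

def orthogonal_discriminant_vectors(Sb, Sw, r):
    """Greedy FSODV-style extraction: the k-th vector maximizes the
    Fisher quotient subject to orthogonality to the first k-1 vectors.
    A sketch under the assumption that Sw is positive definite."""
    d = Sb.shape[0]
    W = []
    for _ in range(r):
        if W:
            # Orthonormal basis Q of the complement of span(W):
            # P is the projector onto that complement, and the SVD of a
            # symmetric projector puts its range in the leading columns.
            Wm = np.column_stack(W)
            P = np.eye(d) - Wm @ Wm.T
            U, _, _ = np.linalg.svd(P)
            Q = U[:, : d - Wm.shape[1]]
        else:
            Q = np.eye(d)
        # Restrict the quotient to the subspace and solve there.
        c = top_gen_eigvec(Q.T @ Sb @ Q, Q.T @ Sw @ Q)
        W.append(Q @ c)
    return np.column_stack(W)
```

Because the search space shrinks at every step, the Fisher quotient of each successive vector can only decrease or stay equal; the point of the rank-one-update scheme is to avoid re-solving a full eigenproblem from scratch at each of these steps.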