Linear Discriminant Analysis (LDA) is one of the most popular approaches to supervised feature extraction and dimension reduction. However, computing LDA requires the eigendecomposition of dense matrices, which is time-consuming for large-scale problems. In this paper, we present a novel algorithm, Rayleigh-Ritz Discriminant Analysis (RRDA), for solving LDA efficiently. While much prior research focuses on transforming the generalized eigenvalue problem into a least-squares formulation, our method is instead based on the well-established Rayleigh-Ritz framework for general eigenvalue problems and solves the generalized eigenvalue problem of LDA directly. By exploiting the structure of LDA problems, we design customized and highly efficient subspace expansion and extraction strategies for the Rayleigh-Ritz procedure. To reduce the storage requirement and computational complexity of RRDA for high-dimensional, low-sample-size data, we also establish an equivalent reduced model of RRDA. We further discuss practical implementation details and the convergence of the method. Experimental results on several real-world data sets demonstrate the efficiency of the proposed algorithm.
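As background, LDA's projection directions solve the generalized eigenproblem S_b w = λ S_w w for the between-class scatter S_b and within-class scatter S_w, and a Rayleigh-Ritz procedure approximates the leading eigenpairs by projecting this matrix pencil onto a small subspace and solving the resulting small problem. The sketch below illustrates this generic expansion/extraction pattern in plain NumPy; the random orthonormal subspace and the small regularizer on S_w are assumptions made for the illustration, and the code is not the paper's RRDA algorithm.

```python
import numpy as np

def lda_rayleigh_ritz(X, y, k, subspace_dim=20, seed=0):
    """Toy Rayleigh-Ritz sketch for the LDA generalized eigenproblem
    S_b w = lambda * S_w w. Illustrative only, not the paper's RRDA."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mean = X.mean(axis=0)

    # Build the between-class (Sb) and within-class (Sw) scatter matrices.
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
        D = Xc - mc
        Sw += D.T @ D
    Sw += 1e-6 * np.eye(d)  # small regularizer so Sw is positive definite

    # Subspace expansion (placeholder): a random orthonormal basis V.
    V = rng.standard_normal((d, subspace_dim))
    V, _ = np.linalg.qr(V)

    # Rayleigh-Ritz extraction: project onto the small pencil (V^T Sb V, V^T Sw V).
    A = V.T @ Sb @ V
    B = V.T @ Sw @ V

    # Reduce the small generalized symmetric eigenproblem via Cholesky of B.
    L = np.linalg.cholesky(B)
    Linv = np.linalg.inv(L)
    M = Linv @ A @ Linv.T
    M = (M + M.T) / 2  # symmetrize against round-off
    vals, vecs = np.linalg.eigh(M)

    # Keep the k largest Ritz values and lift the Ritz vectors back to R^d.
    order = np.argsort(vals)[::-1][:k]
    W = V @ (Linv.T @ vecs[:, order])
    return vals[order], W
```

In a real solver the subspace would be expanded iteratively (e.g., from residuals of the current Ritz pairs) rather than drawn at random; the point here is only the project-then-solve structure of the Rayleigh-Ritz procedure.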