The low-rank regression model has been studied and applied to capture underlying correlation patterns among classes/tasks, so that regression and classification results can be improved. In this paper, we prove that the low-rank regression model is equivalent to performing linear regression in the linear discriminant analysis (LDA) subspace. This new theory reveals the learning mechanism of low-rank regression and shows that the low-rank structures extracted from classes/tasks are connected to the LDA projection results; as a consequence, low-rank regression works efficiently on high-dimensional data. Moreover, we propose new discriminant low-rank ridge regression and sparse low-rank regression methods, both of which are equivalent to performing regularized regression in the regularized LDA subspace. These new regularized objectives yield better data mining results than existing low-rank regression models, in both theoretical and empirical validations. We evaluate our discriminant low-rank regression methods on six benchmark datasets. In all empirical results, the discriminant low-rank models consistently outperform the corresponding full-rank methods.
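To make the setup concrete, the following is a minimal sketch of classical low-rank (reduced-rank) ridge regression, not the authors' exact algorithm: it first solves the full-rank ridge problem, then projects the fitted responses onto their top singular directions to enforce the rank constraint on the coefficient matrix. The function name, the regularization parameter `reg`, and the SVD-projection step are illustrative assumptions.

```python
import numpy as np

def reduced_rank_regression(X, Y, rank, reg=1e-3):
    """Rank-constrained ridge regression (illustrative sketch).

    X : (n, d) data matrix, Y : (n, c) target/indicator matrix.
    Returns a (d, c) coefficient matrix W with rank(W) <= rank.
    """
    d = X.shape[1]
    # Full-rank ridge solution: (X^T X + reg*I)^{-1} X^T Y
    W_full = np.linalg.solve(X.T @ X + reg * np.eye(d), X.T @ Y)
    # Project the fitted values X @ W_full onto their top `rank`
    # right singular directions; this is the classical closed-form
    # solution of the reduced-rank least-squares problem.
    _, _, Vt = np.linalg.svd(X @ W_full, full_matrices=False)
    V = Vt[:rank].T              # (c, rank) projection basis
    return W_full @ V @ V.T      # rank-constrained coefficients

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
Y = rng.standard_normal((50, 4))
W = reduced_rank_regression(X, Y, rank=2)
print(W.shape, np.linalg.matrix_rank(W, tol=1e-8))
```

For classification, `Y` would typically be a one-hot class-indicator matrix, in which case the rank constraint ties the coefficient matrix to a low-dimensional discriminative subspace, which is the connection to LDA that the paper formalizes.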