On the low-rank approximation by the pivoted Cholesky decomposition
Applied Numerical Mathematics
In this paper, we consider the low-rank approximation of a symmetric positive semidefinite matrix, a problem that arises in machine learning, quantum chemistry, and inverse problems. We first characterize the feasible set by X = YY^T with Y ∈ R^{n×k}, which transforms the low-rank approximation into an unconstrained optimization problem over Y. We then use the nonlinear conjugate gradient method with exact line search to compute the optimal low-rank symmetric positive semidefinite approximation of the given matrix. Numerical examples show that the new method is feasible and effective.
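The approach described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it minimizes f(Y) = ||A - YY^T||_F^2 by Fletcher-Reeves nonlinear conjugate gradient, and the exact line search exploits the fact that f(Y + tD) is a quartic in the step size t, so its derivative is a cubic whose real roots can be found directly. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def grad(A, Y):
    # Gradient of f(Y) = ||A - Y Y^T||_F^2 for symmetric A: -4 (A - Y Y^T) Y.
    return -4.0 * (A - Y @ Y.T) @ Y

def exact_step(A, Y, D):
    # Exact line search: minimize f(Y + t D) over t.  With R = A - Y Y^T,
    # M = Y D^T + D Y^T, N = D D^T (all symmetric), one has
    #   f(t) = ||R - t M - t^2 N||_F^2,
    # so f'(t) is the cubic
    #   4<N,N> t^3 + 6<M,N> t^2 + 2(<M,M> - 2<R,N>) t - 2<R,M>.
    R = A - Y @ Y.T
    M = Y @ D.T + D @ Y.T
    N = D @ D.T
    coeffs = [4.0 * np.sum(N * N),
              6.0 * np.sum(M * N),
              2.0 * (np.sum(M * M) - 2.0 * np.sum(R * N)),
              -2.0 * np.sum(R * M)]
    roots = np.roots(coeffs)                      # strips leading zeros itself
    ts = [r.real for r in roots if abs(r.imag) < 1e-8]
    fval = lambda t: np.sum((R - t * M - t * t * N) ** 2)
    return min(ts, key=fval) if ts else 0.0       # best real critical point

def psd_low_rank(A, k, iters=1000, tol=1e-12, seed=0):
    # Fletcher-Reeves nonlinear CG over Y in R^{n x k}; returns Y such that
    # Y Y^T is a (locally) optimal rank-<=k PSD approximation of A.
    rng = np.random.default_rng(seed)
    Y = rng.standard_normal((A.shape[0], k))
    G = grad(A, Y)
    D = -G
    for _ in range(iters):
        t = exact_step(A, Y, D)
        Y = Y + t * D
        G_new = grad(A, Y)
        if np.linalg.norm(G_new) < tol:
            break
        beta = np.sum(G_new * G_new) / np.sum(G * G)   # Fletcher-Reeves
        D = -G_new + beta * D
        if np.sum(D * G_new) >= 0.0:   # restart if not a descent direction
            D = -G_new
        G = G_new
    return Y
```

Note that the problem is nonconvex in Y, so the method converges to a stationary point; in practice a random start typically recovers the best rank-k PSD approximation when A itself is (close to) PSD.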