Related papers:
- Brief paper: Maximum likelihood identification of noisy input-output models. Automatica (Journal of IFAC).
- Overview of total least-squares methods. Signal Processing.
- Journal of Mathematical Imaging and Vision.
- Survey paper: Structured low-rank approximation and its applications. Automatica (Journal of IFAC).
- Optimization Algorithms on Subspaces: Revisiting Missing Data Problem in Low-Rank Matrix. International Journal of Computer Vision.
- Rank Constraints for Homographies over Two Views: Revisiting the Rank Four Constraint. International Journal of Computer Vision.
- Error Analysis in Homography Estimation by First Order Approximation Tools: A General Technique. Journal of Mathematical Imaging and Vision.
- Generic weighted filtering of stochastic signals. IEEE Transactions on Signal Processing.
- Software for weighted structured low-rank approximation. Journal of Computational and Applied Mathematics.
The low-rank approximation problem is to approximate optimally, with respect to some norm, a matrix by one of the same dimension but smaller rank. It is known that under the Frobenius norm, the best low-rank approximation can be found by using the singular value decomposition (SVD). Although this is no longer true under weighted norms in general, it is demonstrated here that the weighted low-rank approximation problem can be solved by finding the subspace that minimizes a particular cost function. A number of advantages of this parameterization over the traditional parameterization are elucidated. Finding the minimizing subspace is equivalent to minimizing a cost function on the Grassmann manifold. A general framework for constructing optimization algorithms on manifolds is presented and it is shown that existing algorithms in the literature are special cases of this framework. Within this framework, two novel algorithms (a steepest descent algorithm and a Newton-like algorithm) are derived for solving the weighted low-rank approximation problem. They are compared with other algorithms for low-rank approximation as well as with other algorithms for minimizing a cost function on a Grassmann manifold.
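The two regimes described in the abstract can be illustrated numerically. Under the Frobenius norm, the truncated SVD yields the best rank-r approximation (the Eckart–Young theorem). Under an elementwise weighted norm this is no longer true in general, and one must minimize a cost over the factors or, as in the paper, over the underlying subspace. The sketch below, which assumes elementwise positive weights W and uses a generic alternating weighted least-squares baseline (not the steepest descent or Newton-like Grassmann algorithms derived in the paper), shows both cases:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 8, 6, 2
A = rng.standard_normal((m, n))

# Unweighted case: truncated SVD gives the best rank-r approximation
# in the Frobenius norm (Eckart-Young theorem).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_r = U[:, :r] * s[:r] @ Vt[:r]  # best rank-r approximation of A

# Weighted case: minimize ||W * (A - B C^T)||_F over B (m x r) and
# C (n x r), with elementwise weights W ("*" is elementwise).  The SVD
# is no longer optimal in general; here a simple alternating weighted
# least-squares scheme (an illustrative baseline, not the paper's
# subspace algorithm) searches for a minimizer.
W = rng.uniform(0.1, 1.0, size=(m, n))
B = rng.standard_normal((m, r))
C = rng.standard_normal((n, r))

def wlra_cost(B, C):
    """Squared weighted Frobenius cost of the rank-r factorization."""
    return np.sum((W * (A - B @ C.T)) ** 2)

cost_start = wlra_cost(B, C)
for _ in range(200):
    # Fix C and solve a separate weighted least-squares problem
    # for each row of B.
    for i in range(m):
        G = C * W[i][:, None]
        B[i] = np.linalg.lstsq(G, W[i] * A[i], rcond=None)[0]
    # Fix B and solve for each row of C.
    for j in range(n):
        G = B * W[:, j][:, None]
        C[j] = np.linalg.lstsq(G, W[:, j] * A[:, j], rcond=None)[0]
cost_end = wlra_cost(B, C)
```

Each half-step solves its weighted least-squares subproblem exactly, so the cost is monotonically non-increasing; however, like the algorithms compared in the abstract, such schemes converge only to a local minimizer in general, which is part of the motivation for the subspace (Grassmann manifold) formulation.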