We introduce a generalized Rayleigh quotient $\rho_A$ on the direct product of Grassmannians $\mathrm{Gr}({\bf m},{\bf n})$, enabling a unified approach to well-known optimization tasks from different areas of numerical linear algebra, such as best low-rank approximations of tensors (data compression), geometric measures of entanglement (quantum computing), and subspace clustering (image processing). We compute the Riemannian gradient of $\rho_A$, characterize its critical points, and prove that they are generically nondegenerate. Moreover, we derive an explicit necessary condition for the nondegeneracy of the Hessian. Finally, we present two intrinsic methods for optimizing $\rho_A$—a Newton-like method and a conjugate gradient method—and compare our algorithms, tailored to the above-mentioned applications, with established ones from the literature.
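To illustrate the kind of optimization the abstract describes, the following is a minimal sketch (not the paper's algorithm) of Riemannian gradient ascent for the classical Rayleigh quotient $\rho_A(x) = x^\top A x$ on $\mathrm{Gr}(1,n)$, the simplest special case of a Rayleigh quotient on a product of Grassmannians. The function name, step size, and tolerance are illustrative choices, not taken from the paper.

```python
import numpy as np

def rayleigh_ascent(A, x0, step=0.1, tol=1e-10, max_iter=10000):
    """Maximize x^T A x over unit vectors x (representing Gr(1, n))
    by Riemannian gradient ascent with a normalization retraction."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        # Euclidean gradient of rho_A at x (A symmetric): 2 A x.
        g = 2.0 * A @ x
        # Riemannian gradient: project onto the tangent space at x.
        rg = g - (x @ g) * x
        if np.linalg.norm(rg) < tol:
            break
        # Ascent step, then retract back to the unit sphere.
        x = x + step * rg
        x /= np.linalg.norm(x)
    return x, x @ A @ x

if __name__ == "__main__":
    A = np.diag([1.0, 2.0, 5.0])
    x, val = rayleigh_ascent(A, np.array([1.0, 1.0, 1.0]))
    print(round(val, 6))  # converges to the largest eigenvalue, 5.0
```

At a critical point the Riemannian gradient vanishes, i.e. $Ax$ is parallel to $x$, so the critical points are exactly the eigenvector lines of $A$; the ascent iteration generically converges to the dominant one, mirroring the critical-point analysis sketched in the abstract.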