SIAM Journal on Matrix Analysis and Applications
We derive a Newton method for computing the best rank-$(r_1,r_2,r_3)$ approximation of a given $J\times K\times L$ tensor $\mathcal{A}$. The problem is formulated as an approximation problem on a product of Grassmann manifolds. Incorporating the manifold structure into Newton's method ensures that all iterates generated by the algorithm are points on the Grassmann manifolds. We also introduce a consistent notation for matricizing a tensor, for contracted tensor products, and for some tensor-algebraic manipulations; this notation simplifies the derivation of the Newton equations and enables a straightforward algorithmic implementation. Experiments show a quadratic convergence rate for the Newton-Grassmann algorithm.
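To make the matricization and multilinear-rank concepts above concrete, here is a minimal NumPy sketch of mode-$k$ unfolding and a truncated higher-order SVD (HOSVD). Note that this is the standard HOSVD truncation, commonly used to initialize iterative methods such as the Newton-Grassmann algorithm; it is not the paper's Newton method itself, and the function names are illustrative.

```python
import numpy as np

def unfold(A, mode):
    # Matricize tensor A along `mode`: mode-k fibers become the columns.
    return np.moveaxis(A, mode, 0).reshape(A.shape[mode], -1)

def truncated_hosvd(A, ranks):
    # Factor matrices U_k: leading r_k left singular vectors of each unfolding.
    U = [np.linalg.svd(unfold(A, k), full_matrices=False)[0][:, :r]
         for k, r in enumerate(ranks)]
    # Core tensor: contract A with U_k^T along each mode.
    S = A
    for k, Uk in enumerate(U):
        S = np.moveaxis(np.tensordot(Uk.T, S, axes=(1, k)), 0, k)
    return S, U

def multilinear_product(S, U):
    # Rebuild the rank-(r_1, r_2, r_3) approximation from core S and factors U.
    A = S
    for k, Uk in enumerate(U):
        A = np.moveaxis(np.tensordot(Uk, A, axes=(1, k)), 0, k)
    return A
```

If $\mathcal{A}$ has exact multilinear rank $(r_1,r_2,r_3)$, the truncation recovers it exactly; in general it gives a good, but not optimal, approximation, which is why refinement by a Newton-type iteration on the Grassmann manifolds is of interest.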