The Alternating Linear Scheme for Tensor Optimization in the Tensor Train Format

  • Authors:
  • Sebastian Holtz; Thorsten Rohwedder; Reinhold Schneider

  • Affiliations:
  • Technische Universität Berlin (sholtz@math.tu-berlin.de, rohwedde@math.tu-berlin.de, schneidr@math.tu-berlin.de)

  • Venue:
  • SIAM Journal on Scientific Computing
  • Year:
  • 2012

Abstract

Recent achievements in the field of tensor product approximation provide promising new formats for the representation of tensors in the form of tree tensor networks. In contrast to the canonical $r$-term representation (CANDECOMP, PARAFAC), these new formats provide stable representations, while the amount of required data is only slightly larger. The tensor train (TT) format [SIAM J. Sci. Comput., 33 (2011), pp. 2295-2317], a simple special case of the hierarchical Tucker format [J. Fourier Anal. Appl., 15 (2009), pp. 706-722], is a useful prototype for practical low-rank tensor representation. In this article, we show how optimization tasks can be treated in the TT format by a generalization of the well-known alternating least squares (ALS) algorithm and by a modified approach (MALS) that enables dynamic rank adaptation. A formulation of the component equations in terms of so-called retraction operators helps to show that many structural properties of the original problems transfer to the micro-iterations, yielding what is, to our knowledge, the first stable generic algorithm for the treatment of optimization tasks in the tensor format. For the examples of linear equations and eigenvalue equations, we derive concrete working equations for the micro-iteration steps; numerical examples confirm the theoretical results concerning the stability of the TT decomposition and of ALS and MALS, but also show that in some cases high TT ranks are required during the iterative approximation of low-rank tensors, leaving room for improvement.
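
To make the TT format referenced in the abstract concrete, the following short NumPy sketch (not code from the paper; the function names tt_svd and tt_to_full, the test tensor, and the truncation tolerance are illustrative assumptions) builds a TT decomposition of a small dense tensor by successive truncated SVDs and checks the reconstruction error. It illustrates only the representation on which the ALS/MALS sweeps operate, not the micro-iteration steps themselves.

    # Minimal TT-format sketch, assuming NumPy; illustrative only.
    import numpy as np

    def tt_svd(tensor, eps=1e-8):
        """Return TT cores G_k of shape (r_{k-1}, n_k, r_k) for a dense tensor."""
        dims = tensor.shape
        cores, r, C = [], 1, tensor
        for k in range(len(dims) - 1):
            C = C.reshape(r * dims[k], -1)                 # unfold the remaining tensor
            U, s, Vt = np.linalg.svd(C, full_matrices=False)
            rank = max(1, int(np.sum(s > eps * s[0])))     # truncate small singular values
            cores.append(U[:, :rank].reshape(r, dims[k], rank))
            C = s[:rank, None] * Vt[:rank]                 # carry the remainder forward
            r = rank
        cores.append(C.reshape(r, dims[-1], 1))
        return cores

    def tt_to_full(cores):
        """Contract the TT cores back into a full dense tensor."""
        full = cores[0]
        for core in cores[1:]:
            full = np.tensordot(full, core, axes=([-1], [0]))
        return full.reshape([c.shape[1] for c in cores])

    if __name__ == "__main__":
        n = 8
        I, J, K, L = np.meshgrid(*(np.arange(n),) * 4, indexing="ij")
        X = 1.0 / (1.0 + I + J + K + L)   # smooth tensor, numerically of low TT rank
        cores = tt_svd(X)
        print("TT ranks:", [c.shape[2] for c in cores[:-1]])
        print("relative error:",
              np.linalg.norm(tt_to_full(cores) - X) / np.linalg.norm(X))

The example tensor is chosen so that its singular values decay rapidly, which keeps the TT ranks small; the amount of stored data is then the sum of the core sizes rather than the full n^4 entries.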