Overview of total least-squares methods

  • Authors: Ivan Markovsky; Sabine Van Huffel
  • Affiliations: School of Electronics and Computer Science, University of Southampton, SO17 1BJ, UK; Katholieke Universiteit Leuven, ESAT-SCD/SISTA, Kasteelpark Arenberg 10, B-3001 Leuven, Belgium
  • Venue: Signal Processing
  • Year: 2007

Abstract

We review the development and extensions of the classical total least-squares method and describe algorithms for its generalization to weighted and structured approximation problems. In the generic case, the classical total least-squares problem has a unique solution, which is given in analytic form in terms of the singular value decomposition of the data matrix. The weighted and structured total least-squares problems have no such analytic solution and are currently solved numerically by local optimization methods. We explain how special structure of the weight matrix and the data matrix can be exploited for efficient cost function and first-derivative computation, which yields computationally efficient solution methods. The total least-squares family of methods has a wide range of applications in system theory, signal processing, and computer algebra. We describe applications to deconvolution, linear prediction, and errors-in-variables system identification.
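
To illustrate the analytic solution mentioned in the abstract, the sketch below (not code from the article) computes the classical total least-squares estimate for an overdetermined system A x ≈ b: in the generic case, the solution is read off from the right singular vector of the augmented data matrix [A b] associated with its smallest singular value. The function name `tls` and the NumPy-based implementation are illustrative assumptions, not part of the paper.

```python
import numpy as np

def tls(A, b):
    """Classical total least-squares solution of the overdetermined system A x ~ b.

    Illustrative sketch: in the generic case the solution comes from the right
    singular vector of the augmented matrix [A b] for its smallest singular value.
    """
    A = np.atleast_2d(np.asarray(A, dtype=float))
    b = np.asarray(b, dtype=float).reshape(-1, 1)
    n = A.shape[1]
    C = np.hstack([A, b])          # augmented data matrix [A b]
    _, _, Vt = np.linalg.svd(C)    # rows of Vt are the right singular vectors
    v = Vt[-1, :]                  # singular vector of the smallest singular value
    if np.isclose(v[n], 0.0):
        raise ValueError("nongeneric problem: classical TLS solution does not exist")
    return -v[:n] / v[n]           # x such that [x; -1] is proportional to v

# Small errors-in-variables example: noise perturbs both A and b.
rng = np.random.default_rng(0)
A0 = rng.standard_normal((200, 2))
x0 = np.array([1.0, -2.0])
A = A0 + 0.05 * rng.standard_normal(A0.shape)
b = A0 @ x0 + 0.05 * rng.standard_normal(200)
print(tls(A, b))                   # close to [1, -2]
```

In this errors-in-variables setting the SVD-based estimate accounts for noise in A as well as in b, which is the scenario the weighted and structured generalizations discussed in the paper extend further.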