Convergence Rates of General Regularization Methods for Statistical Inverse Problems and Applications

  • Authors:
  • N. Bissantz; T. Hohage; A. Munk; F. Ruymgaart

  • Venue:
  • SIAM Journal on Numerical Analysis
  • Year:
  • 2007

Abstract

Previously, the convergence analysis for linear statistical inverse problems has mainly focused on spectral cut-off and Tikhonov-type estimators. Spectral cut-off estimators achieve minimax rates for a broad range of smoothness classes and operators, but their practical usefulness is limited by the fact that they require a complete spectral decomposition of the operator. Tikhonov estimators are simpler to compute but still involve the inversion of an operator and achieve minimax rates only in restricted smoothness classes. In this paper we introduce a unifying technique to study the mean square error of a large class of regularization methods (spectral methods), including the aforementioned estimators as well as many iterative methods, such as $\nu$-methods and the Landweber iteration. The latter estimators converge at the same rate as spectral cut-off but require only matrix-vector products. Our results are applied to various problems; in particular we obtain precise convergence rates for satellite gradiometry, $L^2$-boosting, and errors-in-variables problems.
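The contrast drawn in the abstract can be illustrated with a small numerical sketch (not from the paper itself): on a toy ill-conditioned problem, spectral cut-off needs the full SVD of the operator, whereas the Landweber iteration touches the operator only through matrix-vector products. The forward operator, noise level, truncation level, and iteration count below are all illustrative choices, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ill-conditioned forward operator: a discretized
# integration (cumulative sum) operator, a standard toy example.
n = 50
A = np.tril(np.ones((n, n))) / n

t = np.linspace(0, 1, n)
x_true = np.sin(2 * np.pi * t)                    # smooth "truth"
y = A @ x_true + 1e-3 * rng.standard_normal(n)    # noisy observations

# Spectral cut-off: requires the complete spectral decomposition of A.
U, s, Vt = np.linalg.svd(A)
k = 10  # truncation level plays the role of the regularization parameter
x_cut = Vt[:k].T @ ((U[:, :k].T @ y) / s[:k])

# Landweber iteration: needs only products with A and A^T.
tau = 1.0 / s[0] ** 2      # step size below 1 / ||A||^2 ensures convergence
x_lw = np.zeros(n)
for _ in range(2000):      # the iteration count acts as the regularization parameter
    x_lw += tau * (A.T @ (y - A @ x_lw))

err_cut = np.linalg.norm(x_cut - x_true) / np.linalg.norm(x_true)
err_lw = np.linalg.norm(x_lw - x_true) / np.linalg.norm(x_true)
```

Both estimators recover the truth with small relative error here; the point, echoing the abstract, is that the iterative method achieves comparable accuracy without ever decomposing the operator.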