Regularization in Regression with Bounded Noise: A Chebyshev Center Approach

  • Authors:
  • Amir Beck; Yonina C. Eldar

  • Venue:
  • SIAM Journal on Matrix Analysis and Applications
  • Year:
  • 2007

Abstract

We consider the problem of estimating a vector ${\bf z}$ in the regression model ${\bf b} = {\bf A} {\bf z} + {\bf w}$, where ${\bf w}$ is an unknown but bounded noise. As in many regularization schemes, we assume that an upper bound on the norm of ${\bf z}$ is available. To estimate ${\bf z}$ we propose a relaxation of the Chebyshev center, which is the vector that minimizes the worst-case estimation error over all feasible vectors ${\bf z}$. Relying on recent results regarding strong duality of nonconvex quadratic optimization problems with two quadratic constraints, we prove that in the complex domain our approach leads to the exact Chebyshev center. In the real domain, this strategy results in a “pretty good” approximation of the true Chebyshev center. As we show, our estimate can be viewed as a Tikhonov regularization with a special choice of parameter that can be found efficiently by solving a convex optimization problem with two variables or a semidefinite program with three variables, regardless of the problem size. When the norm constraint on ${\bf z}$ is a Euclidean one, the problem reduces to a single-variable convex minimization problem. We then demonstrate via numerical examples that our estimator can outperform other conventional methods, such as least-squares and regularized least-squares, with respect to the estimation error. Finally, we extend our methodology to other feasible parameter sets, showing that the total least-squares (TLS) and regularized TLS can be obtained as special cases of our general approach.
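To make the setup concrete, the worst-case criterion described in the abstract can be written out explicitly. The notation below (the noise bound $\rho$, the norm bound $\eta$, the regularization operator ${\bf L}$, and the feasible set $\mathcal{Q}$) is introduced here for illustration and is not quoted from the paper itself:

$$
\mathcal{Q} = \left\{ {\bf z} \;:\; \|{\bf b} - {\bf A}{\bf z}\|^2 \le \rho^2,\ \ \|{\bf L}{\bf z}\|^2 \le \eta^2 \right\},
\qquad
\hat{{\bf z}}_{\rm CC} = \arg\min_{\hat{{\bf z}}}\, \max_{{\bf z} \in \mathcal{Q}} \|\hat{{\bf z}} - {\bf z}\|^2 .
$$

In this reading, the abstract's statement that the estimate "can be viewed as a Tikhonov regularization with a special choice of parameter" corresponds to a solution of the familiar Tikhonov form $\hat{{\bf z}} = ({\bf A}^* {\bf A} + \alpha {\bf L}^* {\bf L})^{-1} {\bf A}^* {\bf b}$, with $\alpha \ge 0$ determined by the small convex (or semidefinite) problem mentioned above; the explicit choice of parameter is derived in the paper and is not reproduced here.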