Lower Bounds on the Mean-Squared Error of Low-Rank Matrix Reconstruction

  • Authors:
  • Gongguo Tang; Arye Nehorai

  • Affiliations:
  • Preston M. Green Department of Electrical and Systems Engineering, Washington University in St. Louis, St. Louis, MO, USA (both authors)

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2011

Abstract

We investigate the behavior of the mean-squared error (MSE) in low-rank matrix reconstruction and its special case, matrix completion. We first derive the constrained Cramér–Rao bound (CRB) on the MSE matrix of any locally unbiased estimator, and then analyze the behavior of the constrained CRB when a subset of the entries of the underlying matrix is observed at random. We design an alternating minimization procedure to compute the maximum likelihood estimator (MLE) of the low-rank matrix, and demonstrate through numerical simulations that the performance of the MLE approaches the constrained CRB when the signal-to-noise ratio is high. Applying a Chapman–Robbins-type Barankin bound allows us to derive lower bounds on the worst-case scalar MSE. We show that the worst-case scalar MSE is infinite even when the model is identifiable; however, the infinite scalar MSE is attained only on a set of low-rank matrices of measure zero. We discuss the implications of these bounds and compare them with the empirical performance of the matrix LASSO estimator and with existing bounds in the literature.
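
For a concrete picture of the alternating minimization procedure mentioned in the abstract, the sketch below shows one standard way to compute the rank-r MLE for matrix completion under i.i.d. Gaussian noise on the observed entries: with one factor held fixed, the likelihood is maximized row by row through small least-squares problems. This is a minimal NumPy sketch under those assumptions (the function name alt_min_mle, the iteration count, and the random initialization are our own illustrative choices, not the authors' implementation).

```python
import numpy as np

def alt_min_mle(Y, mask, r, n_iters=100, seed=0):
    """Alternating least-squares sketch of the rank-r MLE under i.i.d.
    Gaussian noise on the observed entries. Function name, arguments,
    and defaults are illustrative, not taken from the paper.

    Y    : (m, n) array of noisy observations (entries outside the mask are ignored)
    mask : (m, n) boolean array, True where an entry is observed
    r    : assumed rank of the underlying matrix
    """
    m, n = Y.shape
    rng = np.random.default_rng(seed)
    L = rng.standard_normal((m, r))   # left factor
    R = rng.standard_normal((n, r))   # right factor; reconstruction is L @ R.T
    for _ in range(n_iters):
        # With R fixed, the negative log-likelihood separates over rows of L,
        # so each row is an ordinary least-squares fit on its observed entries.
        for i in range(m):
            obs = mask[i, :]
            if obs.any():
                L[i], *_ = np.linalg.lstsq(R[obs], Y[i, obs], rcond=None)
        # Symmetrically, with L fixed, update each row of R.
        for j in range(n):
            obs = mask[:, j]
            if obs.any():
                R[j], *_ = np.linalg.lstsq(L[obs], Y[obs, j], rcond=None)
    return L @ R.T
```

A check in the spirit of the paper's simulations is to compare the empirical MSE of such a reconstruction at high SNR with the constrained CRB; in the toy run below the rank, sampling rate, and noise level are arbitrary values chosen for illustration.

```python
# Toy example: rank-2 ground truth, roughly half the entries observed with small Gaussian noise.
rng = np.random.default_rng(1)
X = rng.standard_normal((40, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(X.shape) < 0.5
Y = X + 0.01 * rng.standard_normal(X.shape)
X_hat = alt_min_mle(Y, mask, r=2)
empirical_mse = np.mean((X_hat - X) ** 2)
```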