Uniformly Improving the Cramér-Rao Bound and Maximum-Likelihood Estimation

  • Authors:
  • Y.C. Eldar

  • Affiliations:
  • Dept. of Electr. Eng., Israel Inst. of Technol., Haifa

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2006

Abstract

An important aspect of estimation theory is characterizing the best achievable performance in a given estimation problem, as well as determining estimators that achieve the optimal performance. The traditional Cramér-Rao-type bounds provide benchmarks on the variance of any estimator of a deterministic parameter vector under suitable regularity conditions, while requiring a priori specification of a desired bias gradient. In applications, it is often not clear how to choose the required bias. A direct measure of the estimation error that takes both the variance and the bias into account is the mean squared error (MSE), which is the sum of the variance and the squared norm of the bias. Here, we develop bounds on the MSE in estimating a deterministic parameter vector x0 over all bias vectors that are linear in x0, which includes traditional unbiased estimation as a special case. In some settings, it is possible to minimize the MSE over all linear bias vectors. More generally, direct minimization is not possible since the optimal solution depends on the unknown x0. Nonetheless, we show that in many cases, we can find bias vectors that result in an MSE bound that is smaller than the Cramér-Rao lower bound (CRLB) for all values of x0. Furthermore, we explicitly construct estimators that achieve these bounds in cases where an efficient estimator exists, by performing a simple linear transformation on the standard maximum-likelihood (ML) estimator. This leads to estimators that result in a smaller MSE than the ML approach for all possible values of x0.
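
For reference, the MSE decomposition and the linear-transformation construction mentioned in the abstract can be written out explicitly. The expressions below are standard and are given only as a sketch of the setting, not as a reproduction of the paper's derivations; the symbols J(x0) for the Fisher information matrix, M for the transformation matrix, and \hat{x}_{ML} for the ML estimate are notation introduced here. For an estimator \hat{x} of x0 with bias b(x0) = E[\hat{x}] - x0,

\mathrm{MSE}(\hat{x}, x_0) = E\{\|\hat{x} - x_0\|^2\} = \mathrm{tr}\{\mathrm{Cov}(\hat{x})\} + \|b(x_0)\|^2 .

If \hat{x}_{ML} is efficient, i.e., unbiased with covariance equal to the CRLB matrix J^{-1}(x_0), then the linearly transformed estimator \hat{x} = M \hat{x}_{ML} has bias (M - I)x_0, which is linear in x_0, and

\mathrm{MSE}(M \hat{x}_{ML}, x_0) = \mathrm{tr}\{M J^{-1}(x_0) M^T\} + \|(M - I)x_0\|^2 .

Setting M = I recovers the unbiased case and the CRLB \mathrm{tr}\{J^{-1}(x_0)\}; the paper's contribution, per the abstract, is identifying bias choices (and, when an efficient estimator exists, corresponding transformations of the ML estimator) for which the biased expression is smaller than the CRLB for every value of x_0.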