Lower bounds on the mean square error derived from mixture of linear and non-linear transformations of the unbiasness definition

  • Authors:
  • Eric Chaumette; Alexandre Renaux; Pascal Larzabal

  • Affiliations:
  • ONERA - DEMR/TSI, The French Aerospace Lab, Chemin de la Hunière, F-91120 Palaiseau, France; Université Paris-Sud 11, L2S, Supelec, 3 rue Joliot Curie, F-91190 Gif-Sur-Yvette, France; SATIE, ENS Cachan, CNRS, UniverSud, 61 av President Wilson, F-94230, France

  • Venue:
  • ICASSP '09 Proceedings of the 2009 IEEE International Conference on Acoustics, Speech and Signal Processing
  • Year:
  • 2009

Abstract

It is well known that in non-linear estimation problems the ML estimator exhibits a threshold effect, i.e. a rapid deterioration of estimation accuracy below a certain SNR or number of snapshots. This effect is caused by outliers and is not captured by standard tools such as the Cramér-Rao bound (CRB). The search for the SNR threshold value can be carried out with the help of approximations of the Barankin bound (BB) proposed by many authors. These approximations result from a linear transformation (discrete or integral) of the uniform unbiasedness constraint introduced by Barankin. Nevertheless, non-linear transformations can be used as well for some classes of p.d.f., including the Gaussian case. The benefit is their combination with existing linear transformations to obtain tighter lower bounds, improving the SNR threshold prediction.
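The threshold effect described in the abstract can be reproduced numerically. The following sketch (an illustration of the general phenomenon, not code from the paper; the single-tone frequency estimation setup, amplitude, SNR values, and trial counts are all assumptions) compares the Monte Carlo MSE of the ML frequency estimator, implemented as a periodogram-peak search, with the CRB below and above the threshold:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 16          # number of snapshots (assumed)
f0 = 0.25       # true frequency in cycles/sample (assumed)
A = 1.0         # amplitude
trials = 500
grid = 4096     # zero-padded FFT size used as the ML search grid
n = np.arange(N)
signal = A * np.exp(2j * np.pi * f0 * n)

def crb(snr_db):
    """CRB on the frequency (cycles/sample) of a single complex exponential."""
    snr = 10.0 ** (snr_db / 10.0)
    return 6.0 / ((2 * np.pi) ** 2 * snr * N * (N**2 - 1))

def mc_mse(snr_db):
    """Monte Carlo MSE of the ML estimator (periodogram peak)."""
    sigma2 = A**2 / 10.0 ** (snr_db / 10.0)
    err2 = np.empty(trials)
    for t in range(trials):
        noise = np.sqrt(sigma2 / 2) * (rng.standard_normal(N)
                                       + 1j * rng.standard_normal(N))
        k = np.argmax(np.abs(np.fft.fft(signal + noise, grid)))
        e = (k / grid - f0 + 0.5) % 1.0 - 0.5   # wrap error into [-0.5, 0.5)
        err2[t] = e * e
    return err2.mean()

ratio_low = mc_mse(-10.0) / crb(-10.0)   # below threshold: outliers dominate
ratio_high = mc_mse(10.0) / crb(10.0)    # above threshold: close to the CRB
```

Below the threshold the MSE exceeds the CRB by orders of magnitude because the periodogram peak frequently locks onto a noise maximum (an outlier); this is precisely the regime where Barankin-type bounds remain informative while the CRB does not.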
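The "linear transformation (discrete ...)" of the unbiasedness constraint mentioned above can be sketched as a discrete Barankin bound approximation of McAulay-Seidman type. This is a minimal illustration under assumed toy settings: a scalar Gaussian observation x ~ N(s(θ), σ²) with a hypothetical non-linear mean s(θ) = cos(2πθ), where the closed form E[L_k L_l] = exp(d_k d_l / σ²) holds for the likelihood ratios L_k:

```python
import numpy as np

sigma2 = 0.5     # noise variance (assumed)
theta0 = 0.2     # true parameter value (assumed)

def s(theta):
    """Hypothetical non-linear mean of the Gaussian observation."""
    return np.cos(2 * np.pi * theta)

def ds(theta):
    """Derivative of the mean, used for the CRB."""
    return -2 * np.pi * np.sin(2 * np.pi * theta)

def barankin_approx(test_points):
    """Discrete Barankin bound approximation from test points theta_k.

    With L_k = p(x; theta_k) / p(x; theta0) and x ~ N(s(theta), sigma2),
    E[L_k L_l] = exp(d_k d_l / sigma2) where d_k = s(theta_k) - s(theta0),
    so the bound is xi^T B^{-1} xi with B = Cov(L_k - 1, L_l - 1).
    """
    tp = np.asarray(test_points, dtype=float)
    d = s(tp) - s(theta0)
    xi = tp - theta0
    B = np.exp(np.outer(d, d) / sigma2) - 1.0   # E[L_k L_l] - 1
    return float(xi @ np.linalg.solve(B, xi))

crb = sigma2 / ds(theta0) ** 2                       # Cramér-Rao bound
hcr = barankin_approx([theta0 + 1e-4])               # Chapman-Robbins, small h
bb = barankin_approx([theta0 + 1e-4, theta0 + 0.3])  # extra distant test point
```

With a single test point θ0 + h and h → 0 the bound collapses to the CRB; adding a second, distant test point can only tighten the quadratic form, which is how such discrete approximations capture behavior in the threshold region that the CRB misses.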