Robust least-squares estimation with a relative entropy constraint

  • Authors:
  • B. C. Levy; R. Nikoukhah

  • Affiliations:
  • Dept. of Electr. & Comput. Eng., University of California, Davis, CA, USA; -

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2006

Abstract

Given a nominal statistical model, we consider the minimax estimation problem of finding the best least-squares estimator for the least favorable statistical model within a neighborhood of the nominal model. The neighborhood is formed by placing a bound on the Kullback-Leibler (KL) divergence between the actual and nominal models. For a Gaussian nominal model and a finite observation interval, or for a stationary Gaussian process over an infinite interval, the usual noncausal Wiener filter remains optimal; however, the filter's worst-case performance depends on the size of the neighborhood representing the model uncertainty. Standard causal least-squares estimators, on the other hand, are not optimal, and a characterization is provided for the optimal causal estimator and the corresponding least favorable model. This causal estimator takes the form of a risk-sensitive estimator with an appropriately selected risk-sensitivity coefficient.
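
As a minimal sketch of the problem setup (notation assumed here for illustration, not taken verbatim from the paper): with f_0 the nominal density, f a candidate model, and c > 0 the KL tolerance, the minimax problem reads

\min_{\hat{x}(\cdot)} \; \max_{f \,:\, D(f \| f_0) \le c} \; E_f\!\left[ \| x - \hat{x}(y) \|^2 \right], \qquad D(f \| f_0) = \int f(z) \ln \frac{f(z)}{f_0(z)} \, dz .

In the causal setting, the resulting estimator has the structure of a risk-sensitive estimator, i.e., one minimizing an exponential-quadratic criterion of the standard form

\frac{2}{\theta} \ln E\!\left[ \exp\!\left( \frac{\theta}{2} \, \| x - \hat{x}(y) \|^2 \right) \right],

where the risk-sensitivity coefficient \theta > 0 is selected to match the tolerance c.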