James-Stein state filtering algorithms

  • Authors:
  • J.H. Manton; V. Krishnamurthy; H.V. Poor

  • Affiliations:
  • Dept. of Electr. & Electron. Eng., Melbourne Univ., Parkville, Vic.

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 1998

Quantified Score

Hi-index 35.68

Abstract

In 1961, James and Stein discovered a remarkable estimator that dominates the maximum-likelihood estimate of the mean of a p-variate normal distribution, provided the dimension p is greater than two. This paper extends the James-Stein estimator and highlights the benefits of applying these extensions to adaptive signal processing problems. The main contribution of this paper is the derivation of the James-Stein state filter (JSSF), which is a robust version of the Kalman filter. The JSSF is designed for situations where the parameters of the state-space evolution model are not known with any certainty. In deriving the JSSF, we derive several other results. We first derive a James-Stein estimator for estimating the regression parameter in a linear regression. A recursive implementation, which we call the James-Stein recursive least squares (JS-RLS) algorithm, is derived. The resulting estimate, although biased, has a smaller mean-square error than the traditional RLS algorithm. Finally, several heuristic algorithms are presented, including a James-Stein version of the Yule-Walker equations for AR parameter estimation.
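
For readers unfamiliar with the 1961 result the abstract builds on, the following is a minimal sketch of the classical James-Stein shrinkage estimator for a single p-variate normal observation with known noise variance. It is illustrative only: the function name, the positive-part variant, and the Monte Carlo check are assumptions for the example, not the paper's JSSF or JS-RLS algorithms.

```python
import numpy as np

def james_stein_estimate(x, sigma2=1.0):
    """Classical James-Stein shrinkage estimate of the mean theta from a
    single observation x ~ N(theta, sigma2 * I), with dimension p > 2.

    Shrinks the maximum-likelihood estimate (x itself) toward the origin;
    this dominates the MLE in total mean-square error whenever p > 2.
    """
    x = np.asarray(x, dtype=float)
    p = x.size
    if p <= 2:
        return x  # no dominance result for p <= 2; return the MLE
    shrinkage = 1.0 - (p - 2) * sigma2 / np.dot(x, x)
    # Positive-part variant: avoid shrinking past the origin.
    return max(shrinkage, 0.0) * x

# Quick Monte Carlo check of the risk improvement (illustrative only).
rng = np.random.default_rng(0)
theta = np.zeros(10)                                   # true mean, p = 10
samples = rng.normal(theta, 1.0, size=(5000, 10))      # one observation per trial
mse_mle = np.mean(np.sum((samples - theta) ** 2, axis=1))
mse_js = np.mean([np.sum((james_stein_estimate(x) - theta) ** 2)
                  for x in samples])
print(f"MLE risk ~ {mse_mle:.2f}, James-Stein risk ~ {mse_js:.2f}")
```

The improvement is largest when the true mean is close to the shrinkage target, as in this toy run; the paper's contribution lies in extending such shrinkage to regression (JS-RLS) and to state filtering (JSSF) settings where the model parameters are uncertain.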