H∞ optimality of the LMS algorithm

  • Authors:
  • B. Hassibi; A. H. Sayed; T. Kailath

  • Affiliations:
  • Inf. Syst. Lab., Stanford Univ., CA

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 1996

Abstract

We show that the celebrated least-mean squares (LMS) adaptive algorithm is H∞ optimal. The LMS algorithm has long been regarded as an approximate solution to either a stochastic or a deterministic least-squares problem, and it essentially amounts to updating the weight vector estimates along the direction of the instantaneous gradient of a quadratic cost function. We show that LMS can be regarded as the exact solution to a minimization problem in its own right. Namely, we establish that it is a minimax filter: it minimizes the maximum energy gain from the disturbances to the predicted errors, whereas the closely related normalized LMS algorithm minimizes the maximum energy gain from the disturbances to the filtered errors. Moreover, since these algorithms are central H∞ filters, they minimize a certain exponential cost function and are thus also risk-sensitive optimal. We discuss the various implications of these results and show how they provide theoretical justification for the widely observed excellent robustness properties of the LMS filter.
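
For reference, the two update rules the abstract contrasts can be sketched as follows. This is a minimal illustrative sketch, not code from the paper: the step size mu, the regularization constant eps, and the toy system-identification setup are assumptions chosen for the example. LMS takes a step along the instantaneous gradient of the squared a priori error, while normalized LMS scales that step by the regressor energy.

```python
import numpy as np

def lms_update(w, h, d, mu):
    """One LMS step: move w along the instantaneous gradient of (d - h^T w)^2."""
    e = d - h @ w          # a priori (predicted) error
    return w + mu * e * h

def nlms_update(w, h, d, mu, eps=1e-8):
    """One normalized-LMS step: the LMS step scaled by the input energy ||h||^2."""
    e = d - h @ w
    return w + (mu / (eps + h @ h)) * e * h

# Toy usage (assumed setup): identify an unknown 4-tap weight vector
# from noisy linear measurements d = h^T w_true + noise.
rng = np.random.default_rng(0)
w_true = rng.standard_normal(4)
w = np.zeros(4)
for _ in range(2000):
    h = rng.standard_normal(4)                      # regressor (input) vector
    d = h @ w_true + 1e-2 * rng.standard_normal()   # noisy desired signal
    w = lms_update(w, h, d, mu=0.05)
print(np.round(w - w_true, 3))                      # residual weight error is small
```

In the paper's terminology, the a priori error e computed above plays the role of the predicted error, and the H∞ result states that LMS (under suitable step-size conditions) minimizes the worst-case energy gain from the disturbances to these errors, while the normalized variant does so for the a posteriori (filtered) errors.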