Bursting in the LMS algorithm

  • Authors: M. Rupp
  • Affiliation: Dept. of Electr. & Comput. Eng., California Univ., Santa Barbara, CA
  • Venue: IEEE Transactions on Signal Processing
  • Year: 1995

Abstract

The least mean square (LMS) algorithm is known to converge in the mean and in the mean square. However, during short time periods the error sequence can blow up and cause severe disturbances, especially for non-Gaussian processes. The paper discusses potentially unstable short-term behavior of the LMS algorithm for spherically invariant random processes (SIRPs) such as Gaussian, Laplacian, and K0 processes. The result of this investigation is that the probability of bursting decreases with the step size. However, since a smaller step size also slows convergence, one has to trade off convergence speed against the frequency of bursting.
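
Below is a minimal sketch of the standard LMS update discussed in the abstract, not the paper's actual experimental setup: the filter length, step sizes, Laplacian input, and system-identification scenario are illustrative assumptions, chosen only to show the role of the step size mu in the tradeoff between convergence speed and short-term error behavior.

```python
import numpy as np

def lms(x, d, num_taps=8, mu=0.01):
    """Run the LMS update w <- w + mu * e[n] * x_n and return the error sequence."""
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for n in range(num_taps, len(x)):
        x_n = x[n - num_taps + 1 : n + 1][::-1]  # regressor: most recent sample first
        y = w @ x_n                              # filter output
        e[n] = d[n] - y                          # a priori error
        w += mu * e[n] * x_n                     # LMS coefficient update
    return e, w

# Illustrative system-identification run with a heavy-tailed (Laplacian) input,
# one of the SIRP examples mentioned in the abstract.
rng = np.random.default_rng(0)
N = 5000
x = rng.laplace(size=N)                          # non-Gaussian SIRP input
h = rng.standard_normal(8)                       # unknown system to identify
d = np.convolve(x, h)[:N] + 1e-3 * rng.standard_normal(N)

# A larger mu converges faster but, per the abstract, makes bursts of large
# error more likely; a smaller mu converges more slowly.
for mu in (0.05, 0.005):
    e, _ = lms(x, d, num_taps=8, mu=mu)
    print(f"mu={mu}: max |e| after initial convergence = {np.abs(e[1000:]).max():.3f}")
```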