Sparse LMS for system identification

  • Authors:
  • Yilun Chen; Yuantao Gu; Alfred O. Hero

  • Affiliations:
  • Department of EECS, University of Michigan, Ann Arbor, MI 48109-2122, USA; Department of EE, Tsinghua University, Beijing 100084, China; Department of EECS, University of Michigan, Ann Arbor, MI 48109-2122, USA

  • Venue:
  • ICASSP '09: Proceedings of the 2009 IEEE International Conference on Acoustics, Speech and Signal Processing
  • Year:
  • 2009

Abstract

We propose a new approach to adaptive system identification when the system model is sparse. The approach applies ℓ1 relaxation, common in compressive sensing, to improve the performance of LMS-type adaptive methods. This results in two new algorithms, the zero-attracting LMS (ZA-LMS) and the reweighted zero-attracting LMS (RZA-LMS). The ZA-LMS is derived by adding an ℓ1-norm penalty on the coefficients to the quadratic LMS cost function, which introduces a zero attractor into the LMS iteration. The zero attractor promotes sparsity in the filter taps during adaptation and therefore accelerates convergence when identifying sparse systems. We prove that the ZA-LMS can achieve lower mean square error than the standard LMS. To further improve the filtering performance, the RZA-LMS is developed using a reweighted zero attractor. Numerical results show that the RZA-LMS outperforms the ZA-LMS. Experiments demonstrate the advantages of the proposed filters in both convergence rate and steady-state behavior under sparsity assumptions on the true coefficient vector. The RZA-LMS is also shown to be robust as the number of non-zero taps increases.
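To illustrate what the zero attractor does in practice, the following is a minimal NumPy sketch of single-tap-vector updates in the spirit of the ZA-LMS and RZA-LMS described in the abstract. It is not taken from the paper itself: the function names and the parameters mu (step size), rho (zero-attractor strength), and eps (reweighting constant) are illustrative choices, and the exact update forms should be checked against the paper's equations.

    import numpy as np

    def za_lms_update(w, x, d, mu=0.01, rho=1e-4):
        """One ZA-LMS-style iteration: a standard LMS step plus a zero attractor.

        The sign(w) term is the (sub)gradient of the l1 penalty added to the
        quadratic LMS cost; it shrinks every tap toward zero by roughly rho.
        """
        e = d - np.dot(w, x)                # a priori estimation error
        return w + mu * e * x - rho * np.sign(w)

    def rza_lms_update(w, x, d, mu=0.01, rho=1e-4, eps=10.0):
        """One RZA-LMS-style iteration with a reweighted zero attractor.

        Small-magnitude taps are pulled strongly toward zero, while large
        (truly active) taps are left almost untouched.
        """
        e = d - np.dot(w, x)
        return w + mu * e * x - rho * np.sign(w) / (1.0 + eps * np.abs(w))

In a system-identification loop, x would be the current input regressor, d the noisy output observed from the unknown sparse system, and w the adaptive tap vector updated at every sample.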