We propose a new approach to adaptive system identification for sparse system models. The approach applies ℓ1 relaxation, common in compressive sensing, to improve the performance of LMS-type adaptive methods, yielding two new algorithms: the zero-attracting LMS (ZA-LMS) and the reweighted zero-attracting LMS (RZA-LMS). The ZA-LMS is derived by incorporating an ℓ1-norm penalty on the coefficients into the quadratic LMS cost function, which introduces a zero attractor into the LMS iteration. The zero attractor promotes sparsity in the taps during filtering and therefore accelerates convergence when identifying sparse systems. We prove that the ZA-LMS can achieve lower mean square error than the standard LMS. To further improve filtering performance, the RZA-LMS employs a reweighted zero attractor. Numerical results show that the RZA-LMS outperforms the ZA-LMS. Experiments demonstrate the advantages of the proposed filters in both convergence rate and steady-state behavior when the true coefficient vector is sparse, and show that the RZA-LMS remains robust as the number of non-zero taps increases.
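The updates described above can be sketched as follows. This is a minimal illustration, not the authors' reference implementation: it assumes the standard forms in which ZA-LMS appends a zero-attractor term −ρ·sgn(w) to the LMS recursion, and RZA-LMS reweights that term by 1/(1 + ε|w|) so large taps are attracted less. The function names and the parameter values (μ, ρ, ε) are illustrative choices.

```python
import numpy as np

def za_lms(x, d, num_taps, mu=0.01, rho=1e-4):
    """Zero-attracting LMS (sketch): LMS step plus an l1 zero attractor."""
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        xn = x[n - num_taps + 1:n + 1][::-1]      # tap-input vector [x[n], ..., x[n-L+1]]
        e = d[n] - w @ xn                         # a priori estimation error
        w = w + mu * e * xn - rho * np.sign(w)    # LMS update + zero attractor
    return w

def rza_lms(x, d, num_taps, mu=0.01, rho=1e-4, eps=10.0):
    """Reweighted ZA-LMS (sketch): attractor scaled by 1/(1 + eps*|w|)."""
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        xn = x[n - num_taps + 1:n + 1][::-1]
        e = d[n] - w @ xn
        # Reweighting weakens the pull on taps that are clearly non-zero,
        # reducing the bias that the plain zero attractor introduces.
        w = w + mu * e * xn - rho * np.sign(w) / (1.0 + eps * np.abs(w))
    return w
```

The zero-attractor term shrinks small taps toward zero at every iteration, which is what speeds up convergence on sparse systems; its price is a small bias on the non-zero taps, which the reweighted variant mitigates.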