Compressed sensing (CS) lowers the number of measurements required for reconstruction and estimation of signals that are sparse when expanded in a suitable basis. Traditional CS approaches deal with time-invariant sparse signals, meaning that the signal of interest does not vary during the measurement process. However, many signals encountered in practice vary with time as the observation window grows (e.g., video imaging, where the signal is sparse and changes between frames). The present paper develops CS algorithms for time-varying signals, based on the least-absolute shrinkage and selection operator (Lasso), which has been popular for sparse regression problems. The Lasso here is tailored for smoothing time-varying signals, which are modeled as vector-valued discrete-time series. Two algorithms are proposed: the Group-Fused Lasso, for signals whose support is time-invariant while the sample values vary with time; and the Dynamic Lasso, for the general class of signals with time-varying amplitudes and support. The performance of these algorithms is compared with a sparsity-unaware Kalman smoother, a support-aware Kalman smoother, and the standard Lasso, which does not account for time variations. The numerical results amply demonstrate the practical merits of the novel CS algorithms.
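To make the Lasso building block concrete, the following is a minimal sketch of sparse recovery from compressive measurements via iterative soft-thresholding (ISTA), a standard solver for the Lasso objective. This is not the paper's Group-Fused or Dynamic Lasso; the dimensions, regularization weight, and iteration count are illustrative choices, not values from the paper.

```python
import numpy as np

# Illustrative sketch: recover a k-sparse signal x from m < n random
# measurements y = A x by solving the Lasso problem
#   min_x 0.5 * ||y - A x||_2^2 + lam * ||x||_1
# with ISTA (gradient step followed by soft-thresholding).

rng = np.random.default_rng(0)
n, m, k = 100, 40, 5                      # signal length, measurements, sparsity

x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement matrix
y = A @ x_true                                  # noiseless measurements

lam = 0.01                                      # illustrative weight
step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1/L, L = Lipschitz const. of grad
x = np.zeros(n)
for _ in range(500):
    z = x - step * (A.T @ (A @ x - y))          # gradient step on the data term
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(rel_err)
```

The time-varying extensions in the paper couple such per-time-slot Lasso problems across time, penalizing differences between consecutive estimates so that the recovered signal evolves smoothly.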