In this paper, we survey and compare different algorithms that, given an overcomplete dictionary of elementary functions, solve the problem of simultaneous sparse signal approximation with a common sparsity profile induced by an ℓp-ℓq mixed-norm. This problem is also known in the statistical learning community as the group lasso. We have gathered and detailed algorithmic results concerning these two equivalent approximation problems, and we have enriched the discussion by establishing relations between several of the algorithms. Experimental comparisons of the detailed algorithms have also been carried out. The main lesson learned from these experiments is that, depending on the performance measure, greedy approaches and iterative reweighted algorithms are the most efficient, whether one considers computational complexity, sparsity recovery, or mean-square error.
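For concreteness, the simultaneous sparse approximation problem described above can be written as a penalized least-squares problem. The following LaTeX formulation is a sketch using standard group-lasso notation; the symbols X, Phi, W, and lambda are illustrative and not taken verbatim from the paper:

\min_{W} \; \frac{1}{2}\,\| X - \Phi W \|_F^2 \;+\; \lambda\, J_{p,q}(W),
\qquad
J_{p,q}(W) \;=\; \sum_{i} \| W_{i,\cdot} \|_q^{\,p}
\;=\; \sum_{i} \Big( \sum_{j} | W_{i,j} |^{q} \Big)^{p/q},

where X collects the signals to approximate, Phi is the overcomplete dictionary, and W is the coefficient matrix. The row-wise penalty J_{p,q} drives entire rows of W to zero, which is what enforces the common sparsity profile across signals; the choice p = 1, q = 2 recovers the group lasso.

Since the abstract singles out iterative reweighted algorithms, here is a minimal Python/NumPy sketch of an M-FOCUSS-style reweighted least-squares scheme for the ℓp-ℓ2 case. Function and parameter names are hypothetical; this illustrates the general technique under the assumptions above, not the paper's own implementation:

import numpy as np

def irls_simultaneous_sparse(Phi, X, lam=1e-2, p=1.0, n_iter=50, tol=1e-6, eps=1e-10):
    """Iterative reweighted least-squares sketch for
    min_W 0.5 * ||X - Phi W||_F^2 + lam * sum_i ||W[i, :]||_2^p.
    Phi: (m, n) overcomplete dictionary; X: (m, T) signal matrix.
    Returns W: (n, T) coefficients with a joint row-sparsity profile."""
    m, n = Phi.shape
    W = np.linalg.lstsq(Phi, X, rcond=None)[0]  # dense initial estimate
    for _ in range(n_iter):
        # Row weights: rows with large l2 norm are penalized less next pass.
        d = np.linalg.norm(W, axis=1) ** (2.0 - p) + eps
        # Weighted ridge update: W <- D Phi^T (Phi D Phi^T + lam I)^{-1} X,
        # with D = diag(d); Phi * d broadcasts d across the columns of Phi.
        K = (Phi * d) @ Phi.T + lam * np.eye(m)
        W_new = d[:, None] * (Phi.T @ np.linalg.solve(K, X))
        if np.linalg.norm(W_new - W) <= tol * (np.linalg.norm(W) + eps):
            W = W_new
            break
        W = W_new
    return W

Each pass solves a weighted ridge regression, so the per-iteration cost is dominated by an m-by-m linear solve; rows whose norm shrinks toward zero receive ever smaller weights and are effectively pruned, which matches the joint-sparsity behavior the abstract describes.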