Vector-valued data appearing in concrete applications often possess sparse expansions with respect to a preassigned frame for each vector component individually. Additionally, different components may exhibit common sparsity patterns. Sparsity measures that take such joint sparsity patterns into account, promoting the coupling of nonvanishing components, have recently been introduced. These measures are typically constructed as weighted $\ell_1$ norms of componentwise $\ell_q$ norms of frame coefficients. We show how to compute solutions of linear inverse problems with such joint sparsity regularization constraints by fast thresholded Landweber algorithms. We then discuss the adaptive choice of suitable weights appearing in the definition of the sparsity measures. The weights are interpreted as indicators of the sparsity pattern and are iteratively updated after each new application of the thresholded Landweber algorithm. The resulting two-step algorithm is interpreted as a double-minimization scheme for a suitable target functional. We show its $\ell_2$-norm convergence. An implementable version of the algorithm is also formulated, and its norm convergence is proven. Numerical experiments in color image restoration are presented.
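The scheme described above can be sketched in a few lines of NumPy. The sketch below assumes the common $q = 2$ case, i.e. the joint sparsity measure is a weighted $\ell_1$ norm of the $\ell_2$ norms taken across components, so the thresholded Landweber step reduces to a Landweber (gradient) update followed by group soft-thresholding of each coefficient row. The operator `A`, the threshold `tau`, and in particular the weight update rule `w_k = 1/(||X_k||_2 + eps)` in `two_step` are illustrative assumptions, not necessarily the exact quantities analyzed in the paper; the operator is assumed rescaled so that its norm is at most one.

```python
import numpy as np

def group_soft_threshold(X, tau):
    # Joint (l1-of-l2) shrinkage: each row of X collects one frame
    # coefficient across all vector components; its l2 norm is shrunk
    # by tau. tau may be a scalar or a per-row array of weighted thresholds.
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    tau = np.broadcast_to(np.atleast_1d(tau), (X.shape[0],)).reshape(-1, 1)
    return np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0) * X

def joint_sparse_landweber(A, Y, tau, weights=None, n_iter=1000, X0=None):
    # Thresholded Landweber iteration for
    #   min_X ||A X - Y||_F^2 + 2 * sum_k (tau * w_k) * ||row_k(X)||_2,
    # assuming ||A|| <= 1 (rescale A and Y otherwise).
    # Columns of X correspond to the individual vector components.
    X = np.zeros((A.shape[1], Y.shape[1])) if X0 is None else X0
    w = np.ones(A.shape[1]) if weights is None else weights
    for _ in range(n_iter):
        X = group_soft_threshold(X + A.T @ (Y - A @ X), tau * w)
    return X

def two_step(A, Y, tau, n_outer=5, eps=1e-2):
    # Double-minimization sketch: alternate a thresholded Landweber pass
    # with a weight update driven by the current sparsity pattern.
    # The specific rule w_k = 1/(||row_k(X)||_2 + eps) is a hypothetical
    # choice for illustration: rows with large norm (likely in the joint
    # support) get small weights and are shrunk less on the next pass.
    X, w = None, np.ones(A.shape[1])
    for _ in range(n_outer):
        X = joint_sparse_landweber(A, Y, tau, weights=w, X0=X)
        w = 1.0 / (np.linalg.norm(X, axis=1) + eps)
    return X
```

On a jointly 3-row-sparse signal with 3 components and 30 random measurements, both the plain iteration and the reweighted two-step variant concentrate the recovered energy on the true joint support, with the adaptive weights suppressing the off-support rows more aggressively.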