Parallel and distributed computation: numerical methods
Mathematical Programming: Series A and B
Atomic Decomposition by Basis Pursuit
SIAM Journal on Scientific Computing
Smooth minimization of non-smooth functions
Mathematical Programming: Series A and B
Efficient projections onto the l1-ball for learning in high dimensions
Proceedings of the 25th International Conference on Machine Learning
The Group-Lasso for generalized linear models: uniqueness of solutions and efficient algorithms
Proceedings of the 25th International Conference on Machine Learning
Consistency of the Group Lasso and Multiple Kernel Learning
The Journal of Machine Learning Research
Learning with structured sparsity
Proceedings of the 26th Annual International Conference on Machine Learning (ICML '09)
Group lasso with overlap and graph lasso
Proceedings of the 26th Annual International Conference on Machine Learning (ICML '09)
Sparse reconstruction by separable approximation
IEEE Transactions on Signal Processing
A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
SIAM Journal on Imaging Sciences
The Split Bregman Method for L1-Regularized Problems
SIAM Journal on Imaging Sciences
Deblurring Poissonian images by split Bregman techniques
Journal of Visual Communication and Image Representation
Convex and Network Flow Optimization for Structured Sparsity
The Journal of Machine Learning Research
Structured Variable Selection with Sparsity-Inducing Norms
The Journal of Machine Learning Research
Foundations and Trends® in Machine Learning
Multiplier methods: A survey
Automatica (Journal of IFAC)
An O(n) algorithm for quadratic knapsack problems
Operations Research Letters
IEEE Transactions on Image Processing
Active learning via neighborhood reconstruction
Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence (IJCAI '13)
We consider a class of sparse learning problems in high-dimensional feature space regularized by a structured sparsity-inducing norm that incorporates prior knowledge of the group structure of the features. Such problems often pose a considerable challenge to optimization algorithms because the regularization term is both non-smooth and non-separable. In this paper, we focus on two commonly adopted sparsity-inducing regularization terms: the overlapping group Lasso penalties based on the l1/l2-norm and on the l1/l∞-norm. We propose a unified framework based on the augmented Lagrangian method, under which problems with both types of regularization and their variants can be solved efficiently. As one of the core building blocks of this framework, we develop new algorithms using a partial-linearization/splitting technique, and we prove that the accelerated versions of these algorithms require O(1/√ε) iterations to obtain an ε-optimal solution. We compare the performance of these algorithms against that of the alternating direction augmented Lagrangian (ADAL) and FISTA methods on a collection of data sets, and we apply them to two real-world problems to compare the relative merits of the two norms.
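To make the problem class concrete, the sketch below applies FISTA (one of the baseline methods compared in the abstract) to a least-squares loss with a non-overlapping l1/l2 group penalty, where the proximal step reduces to closed-form blockwise soft-thresholding. This is an illustrative assumption, not the paper's partial-linearization/splitting method: with overlapping groups the proximal operator loses this closed form, which is precisely the difficulty the augmented Lagrangian framework addresses. The function names and synthetic data here are hypothetical.

```python
import numpy as np

def group_prox(v, groups, t):
    # Proximal operator of t * sum_g ||v_g||_2 for NON-overlapping groups:
    # blockwise soft-thresholding (closed form only because groups are disjoint).
    out = np.zeros_like(v)
    for g in groups:
        norm = np.linalg.norm(v[g])
        if norm > t:
            out[g] = (1.0 - t / norm) * v[g]
    return out

def fista_group_lasso(A, b, groups, lam, n_iter=500):
    # FISTA for: min_x 0.5 * ||A x - b||^2 + lam * sum_g ||x_g||_2
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    z = x.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ z - b)                          # gradient step at z
        x_new = group_prox(z - grad / L, groups, lam / L)  # proximal step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)      # momentum step
        x, t = x_new, t_new
    return x

# Hypothetical usage on synthetic data: 3 disjoint groups of 5 features,
# with only the first group active in the ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 15))
x_true = np.zeros(15)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(50)
groups = [np.arange(0, 5), np.arange(5, 10), np.arange(10, 15)]
x_hat = fista_group_lasso(A, b, groups, lam=1.0)
```

Because the momentum (acceleration) step is included, this sketch attains the O(1/√ε) iteration complexity that the abstract cites for accelerated first-order methods; dropping the momentum update recovers plain proximal gradient descent at the slower O(1/ε) rate.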