We consider a class of nonsmooth convex optimization problems in which the objective function is the composition of a strongly convex differentiable function with a linear mapping, regularized by the group reproducing kernel norm. Problems of this class arise naturally in applications of the group Lasso, a popular technique for variable selection. An effective approach to solving such problems is the proximal gradient method. In this paper we derive efficient algorithms for this class of convex problems and analyze the convergence of the resulting algorithm and its subalgorithm.
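The proximal gradient iteration for a group-regularized composite objective can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it uses a least-squares loss as a stand-in for the strongly convex smooth term and the standard Euclidean group norm (whose proximal operator is block soft-thresholding) in place of the group reproducing kernel norm; all function and parameter names are assumptions for the example.

```python
import numpy as np

def block_soft_threshold(v, t):
    """Proximal operator of t * ||.||_2 on one group: shrink the group norm by t."""
    nrm = np.linalg.norm(v)
    if nrm <= t:
        return np.zeros_like(v)
    return (1.0 - t / nrm) * v

def proximal_gradient_group_lasso(A, b, groups, lam, step=None, n_iter=500):
    """Minimize 0.5*||A x - b||^2 + lam * sum_g ||x_g||_2 by proximal gradient.

    `groups` is a list of index arrays partitioning the coordinates of x.
    The least-squares loss is an illustrative choice of smooth term.
    """
    m, n = A.shape
    if step is None:
        # 1 / Lipschitz constant of the gradient of the smooth part
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)        # gradient of the smooth part
        y = x - step * grad             # forward (gradient) step
        for g in groups:                # backward step: group-wise prox
            x[g] = block_soft_threshold(y[g], step * lam)
    return x
```

With a step size of at most the reciprocal of the gradient's Lipschitz constant, each iteration decreases the composite objective, and groups whose contribution is small enough are set exactly to zero, which is the variable-selection behavior that motivates the group Lasso.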