Solving structured sparsity regularization with proximal methods

  • Authors and affiliations:
  • Sofia Mosci (Università degli Studi di Genova, DISI, Genova, Italy)
  • Lorenzo Rosasco (Istituto Italiano di Tecnologia, Genova, Italy and CBCL, Massachusetts Institute of Technology, Cambridge, MA)
  • Matteo Santoro (Università degli Studi di Genova, DISI, Genova, Italy)
  • Alessandro Verri (Università degli Studi di Genova, DISI, Genova, Italy)
  • Silvia Villa (Università degli Studi di Genova, DIMA, Genova, Italy)

  • Venue:
  • ECML PKDD'10 Proceedings of the 2010 European conference on Machine learning and knowledge discovery in databases: Part II
  • Year:
  • 2010

Abstract

Proximal methods have recently been shown to provide effective optimization procedures for the variational problems defining ℓ1 regularization algorithms. The goal of this paper is twofold. First, we discuss how proximal methods can be applied to a large class of machine learning algorithms that can be seen as extensions of ℓ1 regularization, namely structured sparsity regularization. For all these algorithms, it is possible to derive an optimization procedure that corresponds to an iterative projection algorithm. Second, we discuss the effect of preconditioning the optimization procedure by adding a strictly convex functional to the objective function. Structured sparsity algorithms are usually based on minimizing a convex (but not strictly convex) objective function, which may lead to undesired unstable behavior. We show that by perturbing the objective function with a small strictly convex term we can often substantially reduce the number of required computations without affecting the prediction performance of the obtained solution.
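To give a concrete feel for the kind of proximal (forward-backward) iteration the abstract refers to, the sketch below implements ISTA for the plain ℓ1 (lasso) problem, with an optional small ℓ2 term `mu` playing the role of the strictly convex perturbation. This is a minimal illustration under assumed names and parameters, not the paper's actual structured-sparsity algorithm.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (coordinate-wise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam, mu=0.0, n_iter=1000):
    """Forward-backward iteration for
        min_w  0.5 * ||X w - y||^2 + lam * ||w||_1 + 0.5 * mu * ||w||^2.
    mu > 0 adds a small strictly convex perturbation to the smooth part."""
    n, d = X.shape
    w = np.zeros(d)
    # step size 1/L, where L bounds the Lipschitz constant of the gradient
    L = np.linalg.norm(X, 2) ** 2 + mu
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) + mu * w          # gradient of the smooth part
        w = soft_threshold(w - grad / L, lam / L)  # proximal step on the l1 term
    return w

# tiny sanity check on noiseless synthetic sparse data
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true
w_hat = ista(X, y, lam=0.1, mu=1e-3)
```

With a small `lam` and noiseless data, `w_hat` recovers the sparse `w_true` up to a small shrinkage bias; the `mu` term makes the objective strictly convex, which is the kind of perturbation whose computational effect the paper studies.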