Minimizing nonconvex functions for sparse vector reconstruction

  • Authors:
  • Nasser Mourad; James P. Reilly

  • Affiliations:
  • Department of Electrical and Computer Engineering, McMaster University, Hamilton, ON, Canada (both authors)

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2010

Abstract

In this paper, we develop a novel methodology for minimizing a class of nonconvex (concave on the non-negative orthant) functions for solving an underdetermined linear system of equations As = x when the solution vector s is known a priori to be sparse. The proposed technique is based on locally replacing the original objective function by a convex quadratic function that is easily minimized. The resulting algorithm is iterative and converges to a fixed point of the original objective function. For certain selections of the objective function, the class of algorithms known as iterative reweighted least squares (IRLS) is shown to be a special case of the proposed methodology; the proposed algorithms thus generalize and unify these previous methods. In addition, we propose a new class of algorithms with better convergence properties than the regular IRLS algorithms, which can therefore be considered enhancements of IRLS. Since the original objective functions are nonconvex, the proposed algorithm is susceptible to convergence to a local minimum. To alleviate this difficulty, we propose a random perturbation technique that enhances the performance of the proposed algorithm. Numerical results show that the proposed algorithms outperform several well-known algorithms that are commonly used for solving the same problem.
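
To make the IRLS special case mentioned above concrete, the following is a minimal sketch, not the authors' enhanced algorithms or their random perturbation scheme. It approximately minimizes the nonconvex objective sum_i |s_i|^p (0 < p <= 1) subject to As = x by repeatedly minimizing a weighted quadratic surrogate, which has a closed-form solution on the constraint set. The function name, the smoothing constant eps, the stopping rule, and the synthetic test data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def irls_sparse_recovery(A, x, p=0.5, eps=1e-8, max_iter=100, tol=1e-8):
    """Sketch of IRLS for approximately solving
    min sum_i |s_i|^p  subject to  A s = x,  with 0 < p <= 1.

    Each iteration replaces the nonconvex objective by a weighted
    quadratic surrogate s^T W s whose constrained minimizer is
    s = Q A^T (A Q A^T)^{-1} x with Q = W^{-1}.
    """
    # Initialize with the minimum l2-norm solution A^+ x.
    s = np.linalg.pinv(A) @ x
    for _ in range(max_iter):
        # Diagonal of Q = W^{-1}; eps keeps the weights finite near zero.
        q = (s**2 + eps) ** (1.0 - p / 2.0)
        AQ = A * q                                   # A @ diag(q)
        # Closed-form minimizer of the quadratic surrogate subject to A s = x.
        s_new = q * (A.T @ np.linalg.solve(AQ @ A.T, x))
        if np.linalg.norm(s_new - s) <= tol * np.linalg.norm(s):
            s = s_new
            break
        s = s_new
    return s


if __name__ == "__main__":
    # Illustrative underdetermined system with a k-sparse ground truth.
    rng = np.random.default_rng(0)
    m, n, k = 20, 50, 4
    A = rng.standard_normal((m, n))
    s_true = np.zeros(n)
    s_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    x = A @ s_true
    s_hat = irls_sparse_recovery(A, x, p=0.5)
    print("recovery error:", np.linalg.norm(s_hat - s_true))
```

Because the surrogate is minimized exactly at each step, the objective is nonincreasing across iterations, but with a nonconvex objective the iterates may still settle at a local minimum; the paper's random perturbation technique is aimed at mitigating exactly that behavior.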