Normalized incremental subgradient algorithm and its application

  • Authors:
  • Qingjiang Shi, Chen He, Lingge Jiang

  • Affiliations:
  • Department of Electronic Engineering, Shanghai Jiao Tong University, Shanghai, China (all authors)

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2009

Abstract

The problem of minimizing the sum of a number of component functions arises in many practical settings. In this paper, a new incremental optimization algorithm, named the normalized incremental subgradient (NIS) algorithm, is proposed for a class of such problems in which the component functions have common local minima. The NIS algorithm proceeds incrementally, just like the general incremental subgradient (IS) algorithm, and can therefore be implemented in a distributed way. In the NIS algorithm, each subiteration updates along a search direction obtained by individually normalizing each component of the subgradient of a component function, which yields much better convergence performance than the IS algorithm and other traditional optimization methods (e.g., the Gauss-Newton method). Convergence of the NIS algorithm is proved and analyzed theoretically for both diminishing and constant stepsizes. Two important applications are presented: solving a class of convex feasibility problems in a distributed way, and distributed maximum likelihood estimation. Numerical examples from two important topics in wireless sensor networks, source localization and node localization, demonstrate the effectiveness and efficiency of the NIS algorithm.
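
The abstract describes the core mechanism: cycle through the component functions, and in each subiteration step along a direction formed by normalizing each component (coordinate) of that function's subgradient. The following is a minimal sketch of that idea, not the authors' exact formulation; the convex-feasibility setup (disk constraints, centers, radii, stepsize schedule) is hypothetical illustrative data, chosen so the component functions share common minima on the intersection of the disks.

```python
import numpy as np

def nis_iteration(x, subgradients, stepsize, eps=1e-12):
    """One NIS pass (sketch): one subiteration per component function,
    stepping along a coordinate-wise normalized subgradient."""
    for subgrad in subgradients:
        g = subgrad(x)
        # Normalize each component of the subgradient individually,
        # so every nonzero entry of the direction becomes +/-1.
        d = np.where(np.abs(g) > eps, np.sign(g), 0.0)
        x = x - stepsize * d
    return x

# Illustrative convex feasibility problem (hypothetical data):
# find a point in the intersection of disks ||x - c_i|| <= r_i,
# using component functions f_i(x) = max(0, ||x - c_i||^2 - r_i^2).
centers = [np.array([0.0, 0.0]), np.array([2.0, 1.0]), np.array([1.0, 2.0])]
radii = [2.0, 2.0, 2.0]

def make_subgrad(c, r):
    def subgrad(x):
        # Subgradient of max(0, ||x - c||^2 - r^2): 2(x - c) outside the disk, 0 inside.
        return 2.0 * (x - c) if np.dot(x - c, x - c) > r * r else np.zeros_like(x)
    return subgrad

subgrads = [make_subgrad(c, r) for c, r in zip(centers, radii)]

x = np.array([10.0, -8.0])                            # arbitrary starting point
for k in range(1, 301):
    x = nis_iteration(x, subgrads, stepsize=1.0 / k)  # diminishing stepsize
print(x)                                              # a point in (or near) the intersection
```

With a diminishing stepsize as above, the iterate settles in the common feasible region; a small constant stepsize would instead keep it within a neighborhood of that region, matching the two stepsize regimes analyzed in the paper.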