The problem of minimizing a sum of component functions arises in many real-world applications. This paper proposes a new incremental optimization algorithm, the normalized incremental subgradient (NIS) algorithm, for a class of such problems in which the component functions share common local minima. Like the general incremental subgradient (IS) algorithm, the NIS algorithm operates incrementally and can therefore be implemented in a distributed fashion. In each subiteration, the NIS update follows a search direction obtained by individually normalizing each component of the subgradient of a component function, which yields much better convergence performance than the IS algorithm and other traditional optimization methods (e.g., the Gauss-Newton method). Convergence of the NIS algorithm is proved and analyzed for both diminishing and constant stepsizes. Two important applications are presented: solving a class of convex feasibility problems in a distributed way, and distributed maximum likelihood estimation. Numerical examples drawn from two important topics in wireless sensor networks, source localization and node localization, demonstrate the effectiveness and efficiency of the NIS algorithm.
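To make the update rule concrete, the following is a minimal sketch of an incremental scheme with component-wise normalized subgradient directions and a diminishing stepsize, as the abstract describes. The function name `nis_minimize`, the stepsize schedule `1/k`, and the test problem are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def nis_minimize(subgradients, x0, num_iters=300, eps=1e-12):
    """Sketch of a normalized incremental subgradient (NIS) iteration.

    subgradients: list of callables g_i(x), each returning a subgradient
    of component function f_i at x. One outer iteration sweeps through
    all components incrementally; in each subiteration every entry of
    the subgradient is normalized individually, so the search direction
    has entries in {-1, 0, +1}.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(1, num_iters + 1):
        alpha = 1.0 / k  # diminishing stepsize (assumed schedule)
        for g in subgradients:  # incremental pass over components
            gi = np.asarray(g(x), dtype=float)
            # component-wise normalization: gi[j] / |gi[j]| (zero stays zero)
            d = np.where(np.abs(gi) > eps, np.sign(gi), 0.0)
            x = x - alpha * d
    return x
```

As a toy instance with a common minimizer, take three identical components f_i(x) = ||x - a||_1, whose subgradient is sign(x - a); the iterates then settle near a regardless of the starting point, with each coordinate moving at the same normalized rate.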