Minimization methods for non-differentiable functions
Incremental Subgradient Methods for Nondifferentiable Optimization. SIAM Journal on Optimization.
Convergence of Approximate and Incremental Subgradient Methods for Convex Optimization. SIAM Journal on Optimization.
Distributed optimization in sensor networks. Proceedings of the 3rd International Symposium on Information Processing in Sensor Networks.
Fastest Mixing Markov Chain on a Graph. SIAM Review.
A Convergent Incremental Gradient Method with a Constant Step Size. SIAM Journal on Optimization.
On decentralized negotiation of optimal consensus. Automatica (Journal of IFAC).
We present an algorithm that generalizes the randomized incremental subgradient method with fixed stepsize due to Nedić and Bertsekas [SIAM J. Optim., 12 (2001), pp. 109-138]. Our algorithm is particularly suitable for distributed implementation and execution; possible applications include distributed optimization, e.g., parameter estimation in networks of tiny wireless sensors. The stochastic component of the algorithm is described by a Markov chain, which can be constructed in a distributed fashion using only local information. We provide a detailed convergence analysis of the proposed algorithm and compare it with existing incremental subgradient methods, both deterministic and randomized.
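To make the idea concrete, here is a minimal sketch of a Markov-chain-driven incremental subgradient iteration on a toy problem. This is not the paper's algorithm: the objective (a sum of absolute-value terms), the path graph, the lazy random walk used as the Markov chain, and all function names are illustrative assumptions. The key mechanism it mirrors is that the component whose subgradient is used next is selected by one transition of a Markov chain over the network, so each node only needs to know its neighbors.

```python
import random

# Toy problem (an assumption, not from the paper): minimize
#   f(x) = sum_i f_i(x),  with  f_i(x) = |x - a[i]|,
# where component i is held by node i of a small network.
a = [0.0, 1.0, 2.0, 3.0]                              # one target per node
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}    # path graph on 4 nodes

def subgrad(i, x):
    # A subgradient of f_i(x) = |x - a[i]|.
    return 1.0 if x > a[i] else (-1.0 if x < a[i] else 0.0)

def markov_incremental_subgradient(x0, alpha=0.01, steps=20000, seed=0):
    """Fixed-stepsize incremental subgradient update, with the active
    component driven by a lazy random walk on the network graph."""
    rng = random.Random(seed)
    x, i = x0, 0
    for _ in range(steps):
        x -= alpha * subgrad(i, x)        # update using node i's component
        if rng.random() < 0.5:            # lazy walk: move with prob. 1/2
            i = rng.choice(neighbors[i])  # else stay at the current node
    return x

x_final = markov_incremental_subgradient(10.0)
```

With a fixed stepsize the iterate does not converge exactly but settles into a neighborhood of the (weighted) minimizing set, here roughly the interval [1, 2]; this matches the fixed-stepsize behavior the convergence analysis quantifies.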