Minimization methods for non-differentiable functions
On large scale nonlinear network optimization
Mathematical Programming: Series A and B
Convergence of some algorithms for convex minimization
Mathematical Programming: Series A and B - Special issue: Festschrift in Honor of Philip Wolfe part II: studies in nonlinear programming
Incremental Subgradient Methods for Nondifferentiable Optimization
SIAM Journal on Optimization
Convergence of Approximate and Incremental Subgradient Methods for Convex Optimization
SIAM Journal on Optimization
Efficient dual methods for nonlinearly constrained networks
ICCSA'05 Proceedings of the 2005 international conference on Computational Science and Its Applications - Volume Part IV
The efficiency of network flow techniques can be exploited in the solution of nonlinearly constrained network flow problems (NCNFP) by means of approximate subgradient methods (ASM). We propose to solve the dual problem with an ASM that uses a variant of the well-known constant step rule of Shor. In this work we analyze the convergence properties of this method and compare its efficiency with that of other approximate subgradient methods on NCNFP.
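To make the iteration concrete, the following is a minimal sketch of a constant-step subgradient method in the spirit of Shor's rule, applied to a simple nondifferentiable test function. The objective, subgradient, and step size here are illustrative assumptions only; they are not the paper's dual NCNFP formulation or its approximate-subgradient variant.

```python
def subgradient_method(f, subgrad, x0, step=0.05, iters=500):
    """Constant-step subgradient minimization: x_{k+1} = x_k - step * g_k.

    Since constant-step subgradient iterations need not decrease f
    monotonically, the best iterate found so far is tracked and returned.
    """
    x = x0
    best_x, best_f = x0, f(x0)
    for _ in range(iters):
        g = subgrad(x)          # any subgradient of f at x
        x = x - step * g        # Shor's constant step rule
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Illustrative example: minimize f(x) = |x - 3|,
# which is nondifferentiable at its minimizer x = 3.
f = lambda x: abs(x - 3.0)
sg = lambda x: 1.0 if x > 3.0 else (-1.0 if x < 3.0 else 0.0)
x_star, f_star = subgradient_method(f, sg, x0=0.0)
```

With a constant step the iterates eventually oscillate within a band of width proportional to the step size around the minimizer, which is why convergence results for such rules are typically stated up to a tolerance.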