On the need for hybrid steps in hybrid proximal point methods
Operations Research Letters
To minimize a closed convex function that is approximated by a sequence of better-behaved functions, we investigate the global convergence of a general hybrid iterative algorithm, which consists of an inexact relaxed proximal point step followed by a suitable orthogonal projection onto a hyperplane. The projection step makes it possible to enforce a fixed relative error criterion for the proximal step. We provide several sets of conditions ensuring the global convergence of this algorithm. The analysis is valid for nonsmooth data in infinite-dimensional Hilbert spaces. Some examples are presented, focusing on penalty/barrier methods in convex programming. We also show that some of the results can be adapted to the zero-finding problem for a maximal monotone operator.
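The abstract describes a two-stage outer iteration: an inexact proximal point step accepted under a fixed relative error criterion, followed by an orthogonal projection onto a separating hyperplane. The following is a minimal finite-dimensional sketch of this structure for a smooth convex function (so the proximal subproblem reduces to a fixed-point equation in the gradient); the function and parameter names are illustrative, and the inner solver is a simple fixed-point iteration standing in for whatever approximation scheme is actually used.

```python
import numpy as np

def hybrid_prox_point(grad, x0, lam=0.5, sigma=0.5, tol=1e-8, max_outer=200):
    """Sketch of a hybrid inexact proximal-point method with a
    hyperplane projection step (illustrative, for smooth convex f).

    Each outer step solves y = x - lam*grad(y) inexactly, accepting y
    once the fixed relative error criterion
        ||lam*grad(y) + y - x|| <= sigma * ||y - x||
    holds, then orthogonally projects x onto the hyperplane
        H = { z : <grad(y), z - y> = 0 }.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        # Inexact proximal step: fixed-point iteration on y = x - lam*grad(y),
        # stopped as soon as the relative error criterion is met.
        y = x - lam * grad(x)
        for _ in range(100):
            v = grad(y)
            r = lam * v + y - x  # proximal residual
            if np.linalg.norm(r) <= sigma * max(np.linalg.norm(y - x), 1e-16):
                break
            y = x - lam * v
        v = grad(y)
        if np.linalg.norm(v) <= tol:  # y is (near) stationary: stop
            return y
        # Orthogonal projection of x onto the hyperplane <v, z - y> = 0.
        x = x - (np.dot(v, x - y) / np.dot(v, v)) * v
    return x

# Illustration on f(z) = 0.5*||z - b||^2, whose gradient is z - b.
b = np.array([1.0, -2.0, 3.0])
sol = hybrid_prox_point(lambda z: z - b, np.zeros(3))
```

With the relaxation parameter `lam` below the reciprocal Lipschitz constant of the gradient, the inner fixed-point iteration contracts, so the relative error criterion is reached in finitely many inner steps; the projection step then never moves past the solution set, which is the mechanism behind the global convergence results discussed above.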