A nonsmooth version of Newton's method
Mathematical Programming: Series A and B
Convergence analysis of some algorithms for solving nonsmooth equations
Mathematics of Operations Research
A trust region algorithm for minimization of locally Lipschitzian functions
Mathematical Programming: Series A and B
Convergence of some algorithms for convex minimization
Mathematical Programming: Series A and B - Special issue: Festschrift in Honor of Philip Wolfe part II: studies in nonlinear programming
A globally convergent Newton method for convex SC1 minimization problems
Journal of Optimization Theory and Applications
A family of variable metric proximal methods
Mathematical Programming: Series A and B
A unified approach to global convergence of trust region methods for nonsmooth optimization
Mathematical Programming: Series A and B
Convergence analysis of some methods for minimizing a nonsmooth convex function
Journal of Optimization Theory and Applications
Globally convergent variable metric method for convex nonsmooth unconstrained minimization
Journal of Optimization Theory and Applications
Trust-region methods
A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
SIAM Journal on Optimization
Practical Aspects of the Moreau-Yosida Regularization: Theoretical Preliminaries
SIAM Journal on Optimization
Semismoothness of solutions to generalized equations and the Moreau-Yosida regularization
Mathematical Programming: Series A and B
Cubic regularization of Newton method and its global performance
Mathematical Programming: Series A and B
Accelerating the cubic regularization of Newton’s method on convex problems
Mathematical Programming: Series A and B
Lagrangian-Dual Functions and Moreau-Yosida Regularization
SIAM Journal on Optimization
Using the Moreau-Yosida regularization and the proximal method, a new trust region algorithm is proposed for nonsmooth convex minimization. At each iteration, a cubic subproblem with an adaptive parameter is solved. Global convergence and Q-superlinear convergence are established under suitable conditions. The overall iteration bound of the proposed algorithm is discussed, and preliminary numerical experience is reported.
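The abstract rests on the Moreau-Yosida regularization, which replaces a nonsmooth convex objective f with the smooth envelope F_lam(x) = min_y f(y) + (1/(2*lam))||y - x||^2, whose minimizer is the proximal point of x. As a minimal illustration (not the paper's algorithm), the sketch below computes the envelope and its gradient for the simple nonsmooth function f(y) = |y|, whose proximal operator is soft-thresholding; the resulting envelope is the smooth Huber function.

```python
import numpy as np

def prox_abs(x, lam):
    # Proximal operator of f(y) = |y|: soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_envelope(x, lam):
    # Moreau-Yosida regularization of f(y) = |y|:
    # F_lam(x) = min_y |y| + (y - x)^2 / (2*lam),
    # attained at y = prox_abs(x, lam). F_lam is smooth even though f is not.
    p = prox_abs(x, lam)
    return np.abs(p) + (p - x) ** 2 / (2.0 * lam)

def envelope_grad(x, lam):
    # Gradient of the envelope: (x - prox(x)) / lam, Lipschitz with constant 1/lam.
    return (x - prox_abs(x, lam)) / lam
```

Minimizing F_lam by a smooth method (here, a cubic-regularized trust region scheme) recovers a minimizer of f, since both functions share the same minimizers; this is the standard bridge between nonsmooth convex minimization and smooth trust region machinery.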