It is well known (see Pang and Chan [8]) that Newton's method, applied to strongly monotone variational inequalities, is locally quadratically convergent. In this paper we show that, even in the absence of strong monotonicity, Newton's method yields a descent direction for a non-convex, non-differentiable merit function. This result is then used to turn Newton's method into a globally convergent algorithm by introducing a line-search strategy. Furthermore, under strong monotonicity, (i) the optimal face is attained after a finite number of iterations, and (ii) the stepsize is eventually fixed at the value one, recovering the usual Newton step. Computational results are presented.
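The globalization scheme described above can be sketched as follows. This is an illustrative implementation, not the paper's exact algorithm: the merit function is taken to be the (non-differentiable) gap function, the Newton direction is obtained by solving the linearized variational inequality over a box via a projected fixed-point iteration, and the backtracking decrease test on the gap is a hypothetical Armijo-type choice. The names `gap`, `solve_linearized_vi`, and `newton_vi`, and all parameter values, are assumptions introduced here.

```python
import numpy as np

def gap(F, x, lo, hi):
    # Gap function g(x) = max_{y in X} F(x)^T (x - y).  For a box
    # X = [lo, hi] the maximizer is y_i = lo_i if F_i(x) > 0, else hi_i.
    Fx = F(x)
    y = np.where(Fx > 0, lo, hi)
    return Fx @ (x - y)

def solve_linearized_vi(Fx, J, x, lo, hi, tau=0.05, iters=500):
    # Projected fixed-point iteration for the linearized VI with map
    # F_lin(y) = F(x) + J (y - x); converges for small tau when J is
    # positive definite (illustrative inner solver, not the paper's).
    y = x.copy()
    for _ in range(iters):
        y = np.clip(y - tau * (Fx + J @ (y - x)), lo, hi)
    return y

def newton_vi(F, JF, x0, lo, hi, tol=1e-8, max_it=50, beta=0.5, sigma=1e-4):
    # Damped Newton method: the Newton step is the solution of the
    # linearized VI, globalized by backtracking on the gap function.
    x = x0.copy()
    for _ in range(max_it):
        g = gap(F, x, lo, hi)
        if g < tol:
            break
        y = solve_linearized_vi(F(x), JF(x), x, lo, hi)
        d = y - x          # Newton direction: a descent direction for the gap
        t = 1.0            # try the full Newton step first
        while gap(F, x + t * d, lo, hi) > (1 - sigma * t) * g and t > 1e-10:
            t *= beta      # backtrack until sufficient gap decrease
        x = x + t * d
    return x

# Affine, strongly monotone example: F(x) = A x + b over the box [0, 10]^2.
# Here the unconstrained root of F lies inside the box, so the VI solution
# is x* = A^{-1}(-b) = (18/11, 16/11), reached in one full Newton step.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([-8.0, -6.0])
F = lambda x: A @ x + b
JF = lambda x: A
x = newton_vi(F, JF, np.array([5.0, 5.0]), np.zeros(2), 10 * np.ones(2))
```

Because the example map is affine, one exact linearized solve already lands on the solution and the line search accepts the unit step, illustrating property (ii) of the abstract.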