Comparison of Two Kinds of Prediction-Correction Methods for Monotone Variational Inequalities
Computational Optimization and Applications
Extragradient-type methods form a class of efficient direct methods for solving monotone variational inequalities: they require only function evaluations and are therefore widely applicable to black-box models. In these methods, the distance between the iterate and any fixed solution point decreases monotonically over the iterations. Moreover, in each iteration the decrease of this squared distance admits a differentiable concave lower-bound function whose formula involves no solution point. In this paper, we investigate properties of this lower bound. Our study reveals that the lower bound yields a steplength interval that guarantees convergence of the entire algorithm. Based on these results, we present two new steplengths: one involves a projection onto the tangent cone and requires no line search, while the other can be computed by finding the positive root of a one-dimensional concave function. Our preliminary numerical results confirm and illustrate the attractiveness of these contributions.
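The prediction-correction structure described above can be sketched as follows. This is a minimal illustration, not the paper's specific algorithm: it assumes a fixed steplength (rather than the adaptive steplengths the paper proposes), a user-supplied projection operator `project` onto the feasible set, and, in the example, the simple VI `F(x) = x - b` over the nonnegative orthant, whose solution is the projection of `b` onto that orthant.

```python
import numpy as np

def extragradient(F, project, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Basic extragradient (prediction-correction) iteration for VI(F, Omega).

    Uses only evaluations of F and projections onto Omega, which is why
    such direct methods suit black-box models. With a monotone, Lipschitz
    F and a sufficiently small fixed steplength, the distance from the
    iterate to any solution decreases monotonically.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        y = project(x - step * F(x))       # prediction step
        x_new = project(x - step * F(y))   # correction step
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Illustrative VI: F(x) = x - b over the nonnegative orthant.
# The solution is max(b, 0) componentwise.
b = np.array([1.0, -2.0, 3.0])
F = lambda x: x - b
project = lambda x: np.maximum(x, 0.0)
sol = extragradient(F, project, np.zeros(3))
```

For this strongly monotone example the iterates contract geometrically, so the fixed steplength suffices; the paper's contribution is precisely in choosing better steplengths without knowing such problem constants.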