A note on a globally convergent Newton method for solving monotone variational inequalities

  • Authors:
  • Patrice Marcotte; Jean-Pierre Dussault

  • Affiliations:
  • Collège Militaire Royal de Saint-Jean, Saint-Jean-sur-Richelieu, Que., Canada J1K 2R1 and GERAD, École des Hautes Études Commerciales, Montréal, Que., Canada H3T 1V6; Département de Mathématiques et Informatique, Université de Sherbrooke, Sherbrooke, Que., Canada J1K 2R1

  • Venue:
  • Operations Research Letters
  • Year:
  • 1987

Abstract

It is well known (see Pang and Chan [8]) that Newton's method, applied to strongly monotone variational inequalities, is locally and quadratically convergent. In this paper we show that Newton's method yields a descent direction for a non-convex, non-differentiable merit function, even in the absence of strong monotonicity. This result is then used to modify Newton's method into a globally convergent algorithm by introducing a linesearch strategy. Furthermore, under strong monotonicity, (i) the optimal face is attained after a finite number of iterations, and (ii) the stepsize is eventually fixed at the value one, so that the usual Newton step is recovered. Computational results are presented.
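
To make the scheme summarized in the abstract concrete, the following is a minimal sketch in Python, not taken from the paper. It assumes the merit function is the standard gap function g(x) = max_{y in X} F(x)^T (x - y), a box-constrained feasible set, an affine monotone map F(x) = Mx + q, a symmetric Jacobian (so the linearized subproblem reduces to a QP), and an Armijo-type backtracking rule; the names F, jac_F, gap, newton_subproblem and the specific test data are illustrative assumptions, and the exact merit function and linesearch used by Marcotte and Dussault may differ in detail.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical test problem: VI(F, X) with box constraints X = [lo, hi]^n
# and an affine monotone map F(x) = M x + q (M positive semidefinite).
M = np.array([[4.0, 1.0], [1.0, 3.0]])
q = np.array([-1.0, -2.0])
lo, hi = np.zeros(2), np.ones(2)

def F(x):
    return M @ x + q

def jac_F(x):
    return M

def gap(x):
    # Gap merit function g(x) = max_{y in X} F(x)^T (x - y);
    # over a box the maximizer is obtained coordinate-wise.
    f = F(x)
    y = np.where(f >= 0.0, lo, hi)
    return f @ (x - y)

def newton_subproblem(x):
    # Linearized VI at x; with a symmetric Jacobian it reduces to the QP
    #   min_y  F(x)^T (y - x) + 0.5 (y - x)^T jac_F(x) (y - x)   over X.
    A = jac_F(x)
    obj = lambda y: F(x) @ (y - x) + 0.5 * (y - x) @ A @ (y - x)
    res = minimize(obj, x, bounds=list(zip(lo, hi)))
    return res.x

def newton_linesearch(x0, tol=1e-8, max_iter=50):
    # Globalized Newton method: take the Newton direction from the linearized
    # subproblem and backtrack on the gap function until sufficient decrease.
    x = x0.copy()
    for _ in range(max_iter):
        if gap(x) < tol:
            break
        d = newton_subproblem(x) - x          # Newton direction
        t = 1.0
        while gap(x + t * d) > (1.0 - 1e-4 * t) * gap(x) and t > 1e-12:
            t *= 0.5                          # Armijo-type backtracking
        x = x + t * d
    return x

print(newton_linesearch(np.array([1.0, 1.0])))
```

In this sketch the unit step t = 1 is always tried first, which mirrors the paper's result that, under strong monotonicity, the backtracking loop eventually accepts the full Newton step.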