A nonmonotone line search technique for Newton's method. SIAM Journal on Numerical Analysis.
Newton's method for B-differentiable equations. Mathematics of Operations Research.
Mathematical Programming: Series A and B.
On stationary points of the implicit Lagrangian for nonlinear complementarity problems. Journal of Optimization Theory and Applications.
Growth behavior of a class of merit functions for the nonlinear complementarity problem. Journal of Optimization Theory and Applications.
On the resolution of monotone complementarity problems. Computational Optimization and Applications.
Nonlinear complementarity as unconstrained optimization. Journal of Optimization Theory and Applications.
Modified Newton methods for solving a semismooth reformulation of monotone complementarity problems. Mathematical Programming: Series A and B - Special issue on computational nonsmooth optimization.
A comparison of large scale mixed complementarity problem solvers. Computational Optimization and Applications.
New NCP-functions and their properties. Journal of Optimization Theory and Applications.
Computational Optimization and Applications - Special issue on computational optimization: a tribute to Olvi Mangasarian, part II.
A new merit function for nonlinear complementarity problems and a related algorithm. SIAM Journal on Optimization.
Journal of Global Optimization.
A family of NCP functions and a descent method for the nonlinear complementarity problem. Computational Optimization and Applications.
Information Sciences: an International Journal.
A non-interior continuation algorithm for the CP based on a generalized smoothing function. Journal of Computational and Applied Mathematics.
A new class of penalized NCP-functions and its properties. Computational Optimization and Applications.
Neural networks for solving second-order cone constrained variational inequality problem. Computational Optimization and Applications.
In the paper [J.-S. Chen, S. Pan, A family of NCP-functions and a descent method for the nonlinear complementarity problem, Computational Optimization and Applications, 40 (2008) 389-404], the authors proposed a derivative-free descent algorithm for nonlinear complementarity problems (NCPs) based on the generalized Fischer-Burmeister merit function ψ_p(a, b) = (1/2)[‖(a, b)‖_p - (a + b)]^2, where ‖·‖_p denotes the p-norm, and observed that the choice of the parameter p strongly influences the numerical performance of the algorithm. In this paper, we analyze this phenomenon theoretically for a derivative-free descent algorithm that is based on a penalized form of ψ_p and uses a search direction different from that of Chen and Pan. More specifically, we show that the proposed algorithm is globally convergent and has a locally R-linear convergence rate, and moreover, that its convergence rate deteriorates as the parameter p decreases. Numerical results are also reported for test problems from MCPLIB, and they further verify the theoretical results obtained.
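As an illustration of the merit function defined above (a minimal sketch, not the authors' code; the function name and the pure-Python implementation are assumptions), ψ_p can be evaluated directly from its definition. It vanishes exactly at the NCP solution conditions a ≥ 0, b ≥ 0, ab = 0:

```python
def psi_p(a: float, b: float, p: float = 2.0) -> float:
    """Generalized Fischer-Burmeister merit function (illustrative sketch):
        psi_p(a, b) = 0.5 * (||(a, b)||_p - (a + b))**2,
    where ||(a, b)||_p = (|a|**p + |b|**p)**(1/p) and p > 1.
    For p = 2 this reduces to the classical Fischer-Burmeister merit function.
    """
    if p <= 1.0:
        raise ValueError("the parameter p must satisfy p > 1")
    norm_p = (abs(a) ** p + abs(b) ** p) ** (1.0 / p)
    return 0.5 * (norm_p - (a + b)) ** 2


if __name__ == "__main__":
    # Zero exactly when a >= 0, b >= 0, and a * b = 0:
    print(psi_p(1.0, 0.0))        # 0.0
    # Positive when the complementarity conditions fail:
    print(psi_p(-1.0, 0.0))       # 2.0
    print(psi_p(2.0, 3.0, p=3.0)) # positive, since a * b != 0
```

The abstract's observation concerns how the choice of p affects a descent method driven by (a penalized form of) this function; the sketch only shows the function itself, not the penalization or the search direction.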