In this paper, we consider using neural networks to efficiently solve the second-order cone constrained variational inequality (SOCCVI) problem. More specifically, two kinds of neural networks are proposed to deal with the Karush-Kuhn-Tucker (KKT) conditions of the SOCCVI problem. The first neural network uses the Fischer-Burmeister (FB) function to recast the KKT conditions as an unconstrained minimization of a merit function. We show that this merit function is a Lyapunov function of the network dynamics and that the neural network is asymptotically stable. The second neural network solves a projection formulation whose solutions coincide with the KKT triples of the SOCCVI problem; its Lyapunov stability and global convergence are proved under suitable conditions. Simulations are provided to show the effectiveness of both proposed neural networks.
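To make the first approach concrete, the following is a minimal sketch of the FB-merit-function gradient flow, specialized to the scalar (nonnegative-orthant) complementarity setting rather than the paper's full second-order-cone version, which requires Jordan-algebra square roots. The network dynamics are dx/dt = -rho * grad Psi(x), where Psi is the merit function built from the FB function phi(a,b) = sqrt(a^2 + b^2) - a - b. The map F, the matrix M, the vector q, and the gain rho below are illustrative assumptions, not taken from the paper; the finite-difference gradient is used only to keep the sketch short.

```python
import numpy as np

def fb(a, b):
    # Scalar Fischer-Burmeister function: phi(a, b) = sqrt(a^2 + b^2) - a - b.
    # phi(a, b) = 0  iff  a >= 0, b >= 0, and a * b = 0.
    return np.sqrt(a**2 + b**2) - a - b

def merit(x, F):
    # Merit function Psi(x) = 0.5 * ||phi(x, F(x))||^2; zero exactly at solutions.
    r = fb(x, F(x))
    return 0.5 * np.dot(r, r)

def grad_merit(x, F, eps=1e-7):
    # Central-difference gradient of Psi (illustrative; the paper works analytically).
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (merit(x + e, F) - merit(x - e, F)) / (2.0 * eps)
    return g

def gradient_flow(F, x0, rho=10.0, dt=1e-3, steps=20000, tol=1e-10):
    # Forward-Euler integration of the neural-network dynamics dx/dt = -rho * grad Psi(x).
    x = x0.astype(float)
    for _ in range(steps):
        if merit(x, F) < tol:
            break
        x -= dt * rho * grad_merit(x, F)
    return x

# Hypothetical test problem: F(x) = M x + q with a positive-definite M.
M = np.array([[4.0, 1.0], [1.0, 3.0]])
q = np.array([-1.0, -2.0])
F = lambda x: M @ x + q
x_star = gradient_flow(F, np.array([1.0, 1.0]))
print("x* =", x_star, " F(x*) =", F(x_star))
```

Because Psi decreases monotonically along trajectories of this flow, it serves as the Lyapunov function referred to in the abstract; the asymptotic-stability argument for the SOC case follows the same outline with the vector-valued FB function.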