Most existing neural networks for solving linear variational inequalities (LVIs) with the mapping Mx + p require M to be positive definite (or positive semidefinite). In this correspondence, it is shown that this condition is sufficient but not necessary for an LVI to be strictly monotone (or monotone) on its constraint set when equality constraints are present. It is then proposed to reformulate monotone LVIs with equality constraints as LVIs with inequality constraints only, which can then be solved by some existing neural networks. General projection neural networks are designed in this correspondence for solving the transformed LVIs. Compared with existing neural networks, the designed networks feature lower model complexity. Moreover, they are guaranteed to converge globally to solutions of the LVI provided that the linear mapping Mx + p is monotone on the constraint set. Because quadratic and linear programming problems are special cases of LVIs in terms of solutions, the designed neural networks can solve them efficiently as well. In addition, it turns out that in a specific case the designed neural network reduces to the primal-dual network for solving quadratic or linear programming problems. The effectiveness of the neural networks is illustrated by several numerical examples.
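To make the connection to quadratic programming concrete, the following is a minimal sketch (not the paper's exact model) of how a projection neural network of the classic form dx/dt = P_Ω(x − α(Mx + p)) − x can be simulated by Euler discretization to solve a box-constrained QP, a special case of the LVI. The matrix M, vector p, box bounds, and step sizes below are all illustrative assumptions, not values from the correspondence.

```python
# Euler discretization of the projection dynamics
#   dx/dt = P_Omega(x - alpha*(M x + p)) - x,
# applied to the QP  min (1/2) x^T M x + p^T x  over a box Omega = [lo, hi]^n.
# All parameter values here are illustrative choices.

def solve_lvi_projection(M, p, lo, hi, alpha=0.1, h=0.5, iters=500):
    """Iterate x <- x + h * (P_Omega(x - alpha*(M x + p)) - x)."""
    n = len(p)
    x = [0.0] * n
    for _ in range(iters):
        # evaluate the linear mapping M x + p
        g = [sum(M[i][j] * x[j] for j in range(n)) + p[i] for i in range(n)]
        # project x - alpha*g componentwise onto the box [lo, hi]^n
        proj = [min(max(x[i] - alpha * g[i], lo), hi) for i in range(n)]
        # Euler step of the projection dynamics
        x = [x[i] + h * (proj[i] - x[i]) for i in range(n)]
    return x

# Example: M = 2I (positive definite, so the mapping is strictly monotone),
# p = (-2, -4); the solution of M x + p = 0 is (1, 2), interior to the box.
x_star = solve_lvi_projection([[2.0, 0.0], [0.0, 2.0]], [-2.0, -4.0], 0.0, 10.0)
```

When the mapping is monotone on the constraint set, as the correspondence requires, trajectories of such dynamics converge to an LVI solution; here the iterates settle at the unconstrained minimizer (1, 2) because it lies inside the box.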