This paper proposes using neural networks to efficiently solve second-order cone programs (SOCPs). To establish the neural networks, the SOCP is first reformulated as a second-order cone complementarity problem (SOCCP) via the Karush-Kuhn-Tucker conditions of the SOCP. SOCCP functions, which transform the SOCCP into a set of nonlinear equations, are then used to design the neural networks. We propose two neural networks based on different SOCCP functions. The first uses the Fischer-Burmeister function to obtain an unconstrained minimization of a merit function; we show that this merit function is a Lyapunov function and that the network is asymptotically stable. The second uses the natural residual function, built from the projection onto the second-order cone, to achieve low computational complexity; it is shown to be Lyapunov stable and to converge globally to an optimal solution under suitable conditions. Simulation results on SOCPs demonstrate the effectiveness of the proposed neural networks.
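To make the two SOCCP functions concrete, here is a minimal NumPy sketch (not the authors' implementation) for a single second-order cone K^n = {(z1, z2) : z1 >= ||z2||}. It uses the standard spectral decomposition of a vector with respect to the cone to compute the projection P_K and the Jordan-algebra square root; both phi_NR(x, y) = x - P_K(x - y) and the Fischer-Burmeister function phi_FB(x, y) = sqrt(x∘x + y∘y) - (x + y) vanish exactly at solutions of the SOCCP. All function names are illustrative.

```python
import numpy as np

def soc_spectral(z):
    """Spectral decomposition z = lam1*u1 + lam2*u2 w.r.t. the second-order cone."""
    z1, z2 = z[0], z[1:]
    nz2 = np.linalg.norm(z2)
    # any unit vector works when z2 = 0; pick zero direction for simplicity
    w = z2 / nz2 if nz2 > 0 else np.zeros_like(z2)
    lam1, lam2 = z1 - nz2, z1 + nz2
    u1 = 0.5 * np.concatenate(([1.0], -w))
    u2 = 0.5 * np.concatenate(([1.0], w))
    return lam1, lam2, u1, u2

def proj_soc(z):
    """Euclidean projection of z onto K^n: clip the spectral values at zero."""
    lam1, lam2, u1, u2 = soc_spectral(z)
    return max(lam1, 0.0) * u1 + max(lam2, 0.0) * u2

def phi_nr(x, y):
    """Natural residual function: zero iff x in K, y in K, x'y = 0."""
    return x - proj_soc(x - y)

def jordan_prod(x, y):
    """Jordan product x∘y = (x'y, x1*y2 + y1*x2) associated with K^n."""
    return np.concatenate(([x @ y], x[0] * y[1:] + y[0] * x[1:]))

def soc_sqrt(z):
    """Jordan-algebra square root of z in K^n (spectral values are >= 0 there)."""
    lam1, lam2, u1, u2 = soc_spectral(z)
    return np.sqrt(max(lam1, 0.0)) * u1 + np.sqrt(max(lam2, 0.0)) * u2

def phi_fb(x, y):
    """Fischer-Burmeister SOCCP function under the Jordan algebra of K^n."""
    return soc_sqrt(jordan_prod(x, x) + jordan_prod(y, y)) - (x + y)
```

For example, x = (1, 1, 0) and y = (1, -1, 0) both lie on the boundary of K^3 with x∘y = 0, so both functions return the zero vector there; away from such complementary pairs they are nonzero, which is what the merit-function and projection-based network dynamics exploit.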