A recurrent neural network is presented that performs quadratic optimization subject to bound constraints on each of the optimization variables. The network is shown to be globally convergent, and conditions on the quadratic problem and the network parameters are established under which exponential asymptotic stability is achieved. Through a suitable choice of the network parameters, the system of differential equations governing the network activations is preconditioned in order to reduce its sensitivity to noise and to roundoff errors. The optimization method employed by the neural network is shown to fall into the general class of gradient methods for constrained nonlinear optimization and, in contrast with penalty function methods, is guaranteed to yield only feasible solutions.
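As a rough illustration of the kind of dynamics the abstract describes, the following sketch applies a projection-type gradient iteration to a bound-constrained quadratic program. This is a discrete-time (forward Euler) caricature, not the paper's actual continuous-time network; the names `A`, `b`, `l`, `u`, and the step size `eta` are assumptions introduced here for the example.

```python
import numpy as np

# Hypothetical example: minimize 0.5 x'Ax + b'x subject to l <= x <= u.
# A diagonal A keeps the example easy to verify by hand.
A = np.array([[2.0, 0.0], [0.0, 2.0]])
b = np.array([-2.0, -6.0])
l, u = np.zeros(2), np.ones(2)

def step(x, eta=0.1):
    # Move against the gradient A x + b, then clip back into the box,
    # so every iterate remains feasible (mirroring the feasibility
    # guarantee claimed for the network, unlike penalty methods).
    return np.clip(x - eta * (A @ x + b), l, u)

x = np.zeros(2)
for _ in range(500):
    x = step(x)

# The unconstrained minimizer is [1, 3]; the upper bound clips the
# second coordinate, so the iterates settle near [1, 1].
print(x)
```

Note that clipping each coordinate independently is only this simple because the box constraints are separable; the paper's preconditioning concerns the continuous-time system's sensitivity, which this toy iteration does not model.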