Mathematical Methods for Neural Network Analysis and Design
Neural Networks for Optimization and Signal Processing
Journal of Global Optimization
A neurodynamic analysis for computing the Schur decomposition of box problems is presented in this paper. By constructing a family of dynamical systems, all the eigenvectors of a given matrix pair (A,B) can be found, and the decomposition thereby realized. Each constructed dynamical system is shown to converge globally to an exact eigenvector of the matrix pair (A,B). The dynamical systems are also shown to be primal in the sense that trajectories starting in the feasible region never escape from it. Compared with existing neural network models for generalized eigenvalue problems, the proposed neurodynamic approach has two advantages: 1) it can find all the eigenvectors, and 2) all the proposed systems converge globally to the problem's exact eigenvectors.
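The abstract does not spell out the particular dynamical systems used, so the following is only a hedged illustration of the general idea: a standard Rayleigh-quotient flow dx/dt = A x - r(x) B x, with r(x) = (xᵀA x)/(xᵀB x), whose equilibria are exactly the generalized eigenvectors of the pair (A,B). The function name and parameters below are hypothetical, not taken from the paper.

```python
import numpy as np

def rayleigh_flow(A, B, x0, dt=1e-3, steps=200_000):
    """Euler-integrate the flow dx/dt = A x - r(x) B x.

    Equilibria satisfy A x = r(x) B x, i.e. they are generalized
    eigenvectors of (A, B); for symmetric A and positive definite B
    the flow generically converges to the largest-eigenvalue direction.
    This is an illustrative sketch, not the paper's actual system.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        r = (x @ A @ x) / (x @ B @ x)      # generalized Rayleigh quotient
        x = x + dt * (A @ x - r * B @ x)   # one Euler step of the flow
        x = x / np.linalg.norm(x)          # keep the trajectory bounded
    return x, (x @ A @ x) / (x @ B @ x)

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M + M.T                           # symmetric A
N = rng.standard_normal((4, 4))
B = N @ N.T + 4 * np.eye(4)           # symmetric positive definite B

x, lam = rayleigh_flow(A, B, rng.standard_normal(4))
print(lam, np.linalg.norm(A @ x - lam * (B @ x)))
```

At convergence the residual A x - λ B x vanishes, so (λ, x) is a generalized eigenpair of (A,B); finding the remaining eigenvectors would require further systems (e.g. deflation), which is the gap the paper's family of systems addresses.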