A recurrent neural network for real-time matrix inversion
Applied Mathematics and Computation
Finite-Time Stability of Continuous Autonomous Systems
SIAM Journal on Control and Optimization
Solution of the matrix equation AX + XB = C [F4]
Communications of the ACM
Linear System Theory and Design
Neural Networks: A Comprehensive Foundation
Neuro-Dynamic Programming
Neural Networks for Combinatorial Optimization: a Review of More Than a Decade of Research
INFORMS Journal on Computing
Weighted least squares solutions to general coupled Sylvester matrix equations
Journal of Computational and Applied Mathematics
From Zhang neural network to Newton iteration for matrix inversion
IEEE Transactions on Circuits and Systems I: Regular Papers
Zhang neural network for online solution of time-varying Sylvester equation
ISICA'07 Proceedings of the 2nd international conference on Advances in computation and intelligence
Expert Systems with Applications: An International Journal
Neural Computing and Applications
CVPR '11 Proceedings of the 2011 IEEE Conference on Computer Vision and Pattern Recognition
Regularized image reconstruction using SVD and a neural network method for matrix inversion
IEEE Transactions on Signal Processing
Noise-Robust Automatic Speech Recognition Using a Predictive Echo State Network
IEEE Transactions on Audio, Speech, and Language Processing
Design and analysis of a general recurrent neural network model for time-varying matrix inversion
IEEE Transactions on Neural Networks
Zhang Neural Network Versus Gradient Neural Network for Solving Time-Varying Linear Inequalities
IEEE Transactions on Neural Networks
Multi-level image thresholding by synergetic differential evolution
Applied Soft Computing
The Bartels-Stewart algorithm is an effective and widely used method, with O(n^3) time complexity, for solving a static Sylvester equation. When applied to a time-varying Sylvester equation, however, its computational burden grows sharply as the sampling period decreases, and it cannot satisfy continuous real-time calculation requirements. Gradient-based recurrent neural networks can solve the time-varying Sylvester equation in real time, but there always exists an estimation error. In contrast, the recently proposed Zhang neural network has been proven to converge to the solution of the Sylvester equation as time goes to infinity. However, with the previously suggested activation functions, this neural network never converges to the desired value in finite time, which may limit its applications in real-time processing. To tackle this problem, a sign-bi-power activation function is proposed in this paper to accelerate the Zhang neural network to finite-time convergence. The global convergence and finite-time convergence properties are proven in theory, and the upper bound of the convergence time is derived analytically. Simulations are performed to evaluate the performance of the neural network with the proposed activation function. In addition, the proposed strategy is applied to online calculation of the pseudo-inverse of a matrix and to nonlinear control of an inverted pendulum system. Both theoretical analysis and numerical simulations validate the effectiveness of the proposed activation function.
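The intuition behind the finite-time claim can be illustrated on the scalar error dynamics that the Zhang neural network design imposes, de/dt = -gamma * phi(e). The sketch below is a minimal numerical experiment, not the paper's implementation: the exact form of the sign-bi-power function used here, phi(e) = (|e|^r + |e|^(1/r)) * sign(e) / 2 with 0 < r < 1, and all parameter values are assumptions for illustration. It compares this activation against a linear activation phi(e) = e, which only converges exponentially (i.e., never exactly in finite time).

```python
import numpy as np

def sign_bi_power(e, r=0.5):
    # Assumed sign-bi-power form, 0 < r < 1:
    # phi(e) = 0.5 * (|e|^r + |e|^(1/r)) * sign(e)
    # The |e|^r term dominates near e = 0 and drives finite-time convergence;
    # the |e|^(1/r) term dominates for large |e| and speeds up the transient.
    return 0.5 * np.sign(e) * (np.abs(e) ** r + np.abs(e) ** (1.0 / r))

def settle_time(phi, e0=2.0, gamma=5.0, dt=1e-4, T=5.0, tol=1e-6):
    # Euler-integrate the ZNN error dynamics de/dt = -gamma * phi(e)
    # and report when |e| first drops below tol (illustrative parameters).
    e, t = e0, 0.0
    while t < T and abs(e) > tol:
        e -= dt * gamma * phi(e)
        t += dt
    return abs(e), t

err_sbp, t_sbp = settle_time(sign_bi_power)
err_lin, t_lin = settle_time(lambda e: e)
print(f"sign-bi-power: |e| = {err_sbp:.2e} at t = {t_sbp:.3f} s")
print(f"linear:        |e| = {err_lin:.2e} at t = {t_lin:.3f} s")
```

With these (illustrative) settings the sign-bi-power activation drives the error below the tolerance well before the linear activation does, which is the qualitative behavior the abstract's finite-time convergence result formalizes; the analytical upper bound on the convergence time is derived in the paper itself.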