Improved gradient-based neural networks for online solution of Lyapunov matrix equation
Information Processing Letters
By using the hierarchical identification principle, based on conventional gradient search, two neural subsystems are developed and investigated for the online solution of the well-known Lyapunov matrix equation. Theoretical analysis shows that, with any monotonically increasing odd activation function, the gradient-based neural network (GNN) models solve the Lyapunov equation exactly and efficiently. Computer simulation results confirm that the states of the presented GNN models globally converge to the solution of the Lyapunov matrix equation. Moreover, when power-sigmoid activation functions are used, the GNN models exhibit superior convergence compared with their linearly-activated counterparts.
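The GNN design outlined in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's implementation: it assumes the Lyapunov equation is written as A^T X + X A + C = 0, defines the residual E = A^T X + X A + C, and integrates the gradient dynamics dX/dt = -gamma (A F(E) + F(E) A^T) by forward Euler, where F is a monotonically increasing odd activation applied elementwise (here a power-sigmoid with illustrative parameters p and xi).

```python
import numpy as np

def power_sigmoid(e, p=3, xi=4.0):
    """A monotonically increasing odd activation (one common power-sigmoid form).

    Uses e**p where |e| >= 1 and a scaled sigmoid near zero; the exact
    parameters p and xi are illustrative choices, not taken from the paper.
    """
    scale = (1.0 + np.exp(-xi)) / (1.0 - np.exp(-xi))
    sig = scale * (1.0 - np.exp(-xi * e)) / (1.0 + np.exp(-xi * e))
    return np.where(np.abs(e) >= 1.0, e**p, sig)

def gnn_lyapunov(A, C, gamma=10.0, dt=1e-3, steps=20000, act=power_sigmoid):
    """Sketch of a gradient neural network solving A^T X + X A + C = 0.

    The energy ||E||_F^2 / 2 with E = A^T X + X A + C has gradient
    A E + E A^T with respect to X; the GNN replaces E by F(E) and
    descends this gradient. Forward Euler stands in for the continuous
    neural dynamics.
    """
    n = A.shape[0]
    X = np.zeros((n, n))
    for _ in range(steps):
        E = A.T @ X + X @ A + C        # residual of the Lyapunov equation
        FE = act(E)                    # elementwise odd activation
        X = X - dt * gamma * (A @ FE + FE @ A.T)
    return X
```

For example, with the Hurwitz matrix A = diag(-2, -3) and C = I, the state matrix of this sketch settles at the unique solution X = diag(1/4, 1/6), and the residual A^T X + X A + C is driven to zero, consistent with the global convergence claimed in the abstract.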