Improved neural solution for the Lyapunov matrix equation based on gradient search
Information Processing Letters
By adopting different activation functions, a class of gradient-based neural networks is developed and presented for the online solution of the Lyapunov matrix equation. Theoretical analysis shows that any monotonically increasing odd activation function can be used to construct such networks, and that the improved neural models are globally convergent. For the convenience of hardware implementation, a schematic circuit is given for the improved neural solvers. Computer simulation results further substantiate that the improved neural networks solve the Lyapunov matrix equation accurately and effectively. Moreover, when power-sigmoid activation functions are used, the improved neural networks converge faster than their linear counterparts.
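The idea described in the abstract can be sketched numerically. A common gradient-based formulation for the Lyapunov equation A^T X + X A + C = 0 defines the residual E = A^T X + X A + C and integrates the descent dynamics dX/dt = -γ (A φ(E) + φ(E) A^T), where φ is applied elementwise. The sketch below is an assumption-laden illustration of this standard gradient model, not the paper's exact circuit: the step sizes, the power-sigmoid parameters (p, ξ), and the test matrices are all hypothetical choices.

```python
import numpy as np

def solve_lyapunov_gnn(A, C, gamma=10.0, dt=1e-3, steps=5000, activation=None):
    """Gradient-neural-network sketch for A^T X + X A + C = 0.

    Forward-Euler integration of the descent dynamics
        dX/dt = -gamma * (A @ phi(E) + phi(E) @ A.T),  E = A^T X + X A + C,
    where phi is a monotonically increasing odd activation applied elementwise.
    Linear activation (phi(e) = e) is used when none is supplied.
    """
    if activation is None:
        activation = lambda e: e  # linear activation
    X = np.zeros_like(A, dtype=float)
    for _ in range(steps):
        E = A.T @ X + X @ A + C
        P = activation(E)
        X -= gamma * dt * (A @ P + P @ A.T)
    return X

def power_sigmoid(e, p=3, xi=4.0):
    """Power-sigmoid activation (hypothetical parameters p, xi):
    odd power e**p for |e| >= 1, scaled sigmoid (written via tanh) inside."""
    coef = (1 + np.exp(-xi)) / (1 - np.exp(-xi))
    return np.where(np.abs(e) >= 1.0, e**p, coef * np.tanh(xi * e / 2.0))

if __name__ == "__main__":
    A = np.array([[-2.0, 1.0],
                  [0.0, -3.0]])   # Hurwitz, so a unique solution exists
    C = np.eye(2)
    X_lin = solve_lyapunov_gnn(A, C)                             # linear model
    X_ps = solve_lyapunov_gnn(A, C, activation=power_sigmoid)    # power-sigmoid
    print(np.linalg.norm(A.T @ X_lin + X_lin @ A + C))
    print(np.linalg.norm(A.T @ X_ps + X_ps @ A + C))
```

Both runs drive the residual norm toward zero; near the equilibrium the power-sigmoid has a larger effective slope than the identity, which is one intuition for the faster convergence the abstract reports for power-sigmoid activations.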