The momentum method is commonly used to accelerate the training of neural networks. In this paper, a new adaptive momentum algorithm is proposed for training split-complex recurrent neural networks. Unlike other momentum methods, the new algorithm uses a variable gain factor and a variable learning rate to speed up convergence and smooth the weight trace. The global convergence of the new algorithm is proved under mild conditions, and numerical results show that the algorithm is efficient on the given test problems.
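The abstract does not give the paper's update equations, so the following is only an illustrative sketch of the general idea: a gradient-descent step with a momentum term whose gain is adapted at each iteration. The damping rule used here (halve the momentum gain whenever the accumulated update points against the descent direction) and all parameter values are assumptions for illustration, not the authors' method, and the example works on a real-valued quadratic rather than a split-complex recurrent network.

```python
import numpy as np

def train(grad, w0, lr=0.1, gain=0.9, steps=300):
    """Gradient descent with an illustrative adaptive momentum gain.

    grad  -- function returning the gradient at the current weights
    lr    -- learning rate (fixed here; the paper varies it adaptively)
    gain  -- base momentum gain factor
    """
    w = np.asarray(w0, dtype=float)
    v = np.zeros_like(w)  # accumulated momentum (previous weight increment)
    for _ in range(steps):
        g = grad(w)
        # Assumed adaptation rule: damp the gain when the momentum term
        # disagrees with the current descent direction -g.
        mu = gain if np.dot(v, -g) >= 0 else 0.5 * gain
        v = mu * v - lr * g   # momentum update of the weight increment
        w = w + v             # apply the increment
    return w

# Minimize the quadratic f(w) = ||w||^2 / 2, whose gradient is w itself.
w_star = train(lambda w: w, w0=[2.0, -3.0])
```

Damping the gain when the momentum opposes the gradient is one simple way to smooth the weight trace while keeping the acceleration benefit on steady descent stretches.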