To avoid unstable behavior during the learning process, two new learning schemes, called the multiplier and constrained learning rate algorithms, are proposed in this paper to provide stable adaptive updating processes for both the synaptic and somatic parameters of the network. In the multiplier method, explicit stability conditions are introduced into the iterative error index, so the new updating formulations contain a set of inequality constraints. In the constrained learning rate algorithm, the learning rate is updated at each iteration by an equation derived from the same stability conditions. With these stable dynamic backpropagation algorithms, any analog target pattern can be realized as a steady output vector, which is a nonlinear vector function of the stable equilibrium point. The applicability of the proposed approaches is illustrated through both analog and binary pattern storage examples.
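To make the constrained learning rate idea concrete, the following Python/NumPy sketch trains a small Hopfield-style network to store an analog pattern as a stable equilibrium. It is a hypothetical illustration only, not the paper's algorithm: the paper's explicit stability conditions are replaced here by a generic contraction condition (the spectral norm of W is kept below 1), and the full dynamic backpropagation gradient is replaced by a simplified one-step surrogate.

import numpy as np

rng = np.random.default_rng(0)

n = 4                                        # number of neurons
W = 0.1 * rng.standard_normal((n, n))        # synaptic weights
b = np.zeros(n)                              # somatic (bias) parameters
target = np.array([0.5, -0.3, 0.8, -0.6])    # analog target pattern

def equilibrium(W, b, iters=200):
    # Relax x <- tanh(W x + b) toward the (unique) stable equilibrium,
    # which exists whenever the network map is a contraction.
    x = np.zeros(len(b))
    for _ in range(iters):
        x = np.tanh(W @ x + b)
    return x

for step in range(500):
    x = equilibrium(W, b)
    err = x - target
    s = 1.0 - x**2                           # tanh' at the equilibrium
    # Simplified one-step gradient of 0.5 * ||x - target||^2; the paper's
    # dynamic backpropagation differentiates through the full dynamics.
    grad_W = np.outer(err * s, x)
    grad_b = err * s

    # Constrained learning rate: shrink eta until the updated W keeps the
    # spectral norm below 1 (tanh is 1-Lipschitz), so the relaxation stays
    # a contraction. A generic stand-in for the paper's explicit conditions.
    eta = 0.1
    W_trial = W - eta * grad_W
    while np.linalg.norm(W_trial, 2) >= 0.99 and eta > 1e-6:
        eta *= 0.5
        W_trial = W - eta * grad_W

    W, b = W_trial, b - eta * grad_b

print("steady output:", equilibrium(W, b))
print("target:      ", target)

The design point this sketch mirrors is that the learning rate is not fixed in advance: at each iteration it is reduced until the candidate update verifiably preserves the stability condition, so the stored pattern remains a stable equilibrium of the network dynamics throughout training.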