SMC'09: Proceedings of the 2009 IEEE International Conference on Systems, Man and Cybernetics
This paper presents a pruning algorithm based on the optimal brain surgeon (OBS) method for general dynamic neural networks (GDNNs). The algorithm uses Hessian information and takes the order of the time delays into account when computing saliencies. In a GDNN, every layer has time-delayed feedback connections to itself and to all other layers. The network parameters are trained with the Levenberg-Marquardt (LM) algorithm, which requires the Jacobian matrix; the Jacobian is computed by real-time recurrent learning (RTRL). Since both LM and OBS need Hessian information, a rational implementation that shares this computation is suggested.
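The shared-Hessian idea in the abstract can be illustrated with a minimal sketch. The code below is not the paper's implementation; it only assumes the standard Gauss-Newton approximation H ≈ JᵀJ built from the RTRL Jacobian J, which an LM step and the classical OBS saliency/prune formulas can then both reuse. All function names are illustrative.

```python
import numpy as np

def lm_step(J, e, w, mu=1e-2):
    """One Levenberg-Marquardt update using the Gauss-Newton Hessian.

    J : Jacobian of the error vector e w.r.t. the weights w (e.g. from RTRL).
    Returns the updated weights and H = J^T J, reusable by OBS.
    """
    H = J.T @ J  # Gauss-Newton Hessian approximation, shared with OBS
    dw = np.linalg.solve(H + mu * np.eye(H.shape[0]), -J.T @ e)
    return w + dw, H

def obs_saliencies(H, w, damping=1e-8):
    """Classical OBS saliency of each weight: s_q = w_q^2 / (2 [H^-1]_qq)."""
    Hinv = np.linalg.inv(H + damping * np.eye(H.shape[0]))
    return w**2 / (2.0 * np.diag(Hinv))

def obs_prune(H, w, q, damping=1e-8):
    """Remove weight q and correct the others: dw = -(w_q / [H^-1]_qq) H^-1 e_q."""
    Hinv = np.linalg.inv(H + damping * np.eye(H.shape[0]))
    dw = -(w[q] / Hinv[q, q]) * Hinv[:, q]
    w_new = w + dw
    w_new[q] = 0.0  # enforce exact removal of the pruned weight
    return w_new
```

A typical loop would alternate LM training steps with pruning of the lowest-saliency weight, recomputing H from the current Jacobian each time; the point of the sketch is that H is formed once per step and consumed by both algorithms.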