This paper presents a pruning algorithm with an adaptive pruning interval for system identification with general dynamic neural networks (GDNNs). GDNNs are artificial neural networks with internal dynamics: every layer has feedback connections with time delays to itself and to all other layers. The parameters are trained with the Levenberg-Marquardt (LM) optimization algorithm, which requires the Jacobian matrix; the Jacobian is calculated by real-time recurrent learning (RTRL). Since both LM and optimal brain surgeon (OBS) pruning need Hessian information, computing time can be saved if OBS reuses the scaled inverse Hessian already calculated for the LM algorithm. This paper discusses the effect of using this scaled Hessian instead of the true Hessian in the OBS pruning approach. In addition, an adaptive pruning interval is introduced. Pruning changes the structure of the identification model drastically, so the parameter optimization task between pruning steps varies in complexity. To guarantee that the parameter optimization algorithm has enough time to cope with the structural changes in the GDNN model, it is suggested to adapt the pruning interval during the identification process. The proposed algorithm is verified in simulation on two standard identification examples.
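The reuse of the LM Hessian in OBS can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: it assumes the LM Hessian approximation H ≈ JᵀJ + μI (with damping factor μ as the "scaling"), computes the standard OBS saliency S_q = w_q² / (2[H⁻¹]_qq) for every weight, removes the least salient one, and applies the OBS correction Δw = −(w_q / [H⁻¹]_qq) · H⁻¹ e_q to the remaining weights. The function name and signature are hypothetical.

```python
import numpy as np

def obs_prune_step(w, J, mu=1e-3):
    """One OBS pruning step that reuses the LM-scaled Hessian (sketch).

    w  : current weight vector, shape (n,)
    J  : Jacobian of the residuals w.r.t. the weights, shape (m, n),
         e.g. as computed by RTRL for a dynamic network
    mu : LM damping factor -- this is what makes the Hessian "scaled"
    """
    n = w.size
    # LM already forms H = J^T J + mu*I; OBS reuses its inverse
    # instead of computing the true Hessian separately.
    H_inv = np.linalg.inv(J.T @ J + mu * np.eye(n))
    # OBS saliency of weight q: S_q = w_q^2 / (2 [H^-1]_qq)
    saliency = w**2 / (2.0 * np.diag(H_inv))
    q = int(np.argmin(saliency))  # least salient weight is pruned
    # OBS correction of the remaining weights:
    # dw = -(w_q / [H^-1]_qq) * H^-1 e_q
    w_new = w - (w[q] / H_inv[q, q]) * H_inv[:, q]
    w_new[q] = 0.0  # pruned weight is forced exactly to zero
    return w_new, q, saliency[q]
```

Because H⁻¹ comes from the damped LM approximation rather than the exact Hessian, the saliencies are only approximate, which is precisely the trade-off the paper examines.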