Backpropagation Algorithms for a Broad Class of Dynamic Networks

  • Authors:
  • Orlando De Jesus; M. T. Hagan

  • Affiliations:
  • Res. Dept., Halliburton Energy Services, Dallas, TX; -

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2007

Abstract

This paper introduces a general framework for describing dynamic neural networks: the layered digital dynamic network (LDDN). This framework allows the development of two general algorithms for computing the gradients and Jacobians of these dynamic networks: backpropagation-through-time (BPTT) and real-time recurrent learning (RTRL). The structure of the LDDN framework enables an efficient implementation of both algorithms for arbitrary dynamic networks. The paper demonstrates that the BPTT algorithm is more efficient for gradient calculations, while the RTRL algorithm is more efficient for Jacobian calculations.
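For readers unfamiliar with gradient computation in recurrent networks, the sketch below illustrates the basic idea behind BPTT for a single tanh recurrent layer with a summed squared-error loss. This is a minimal illustrative example only, not the LDDN formulation or the algorithms developed in the paper; the network structure, variable names (Wx, Wa, b), and loss are all assumptions made for the sketch.

```python
import numpy as np

# Minimal BPTT sketch (illustrative, not the paper's LDDN algorithm) for
#   a(t) = tanh(Wx @ x(t) + Wa @ a(t-1) + b)
# with loss L = 0.5 * sum_t ||a(t) - target(t)||^2.

def bptt_gradients(Wx, Wa, b, xs, targets):
    """Return gradients of the summed squared error w.r.t. Wx, Wa, b."""
    T = len(xs)
    a = np.zeros_like(b)
    acts = []

    # Forward pass: store the activations needed by the backward sweep.
    for t in range(T):
        n = Wx @ xs[t] + Wa @ a + b          # net input at time t
        a_new = np.tanh(n)
        acts.append((a, a_new))              # (previous output, current output)
        a = a_new

    # Backward pass: propagate the sensitivity from t = T-1 down to t = 0.
    dWx = np.zeros_like(Wx)
    dWa = np.zeros_like(Wa)
    db = np.zeros_like(b)
    d_next = np.zeros_like(b)                # sensitivity arriving from time t+1
    for t in reversed(range(T)):
        a_prev, a_t = acts[t]
        da = (a_t - targets[t]) + d_next     # loss term at t plus recurrent term
        dn = da * (1.0 - a_t ** 2)           # tanh'(n) = 1 - tanh(n)^2
        dWx += np.outer(dn, xs[t])
        dWa += np.outer(dn, a_prev)
        db += dn
        d_next = Wa.T @ dn                   # carry sensitivity back to time t-1
    return dWx, dWa, db
```

As the abstract notes, BPTT of this kind sweeps backward through the stored trajectory once per gradient evaluation, whereas RTRL propagates sensitivities forward alongside the simulation; the paper's contribution is casting both computations in the LDDN framework for arbitrary network topologies and comparing their costs for gradient versus Jacobian calculations.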