On the choice of parameters of the cost function in nested modular RNN's

  • Authors:
  • D. P. Mandic; J. A. Chambers

  • Affiliations:
  • Sch. of Inf. Syst., East Anglia Univ., Norwich

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2000

Abstract

We address the choice of the coefficients in the cost function of a modular nested recurrent neural network (RNN) architecture known as the pipelined recurrent neural network (PRNN). Such a network can cope with the vanishing gradient problem experienced in prediction with RNNs. Constraints on the coefficients of the cost function, in the form of a vector norm, are considered. Unlike the previous cost function for the PRNN, which included a forgetting factor motivated by the recursive least squares (RLS) strategy, the proposed forms of cost function provide “forgetting” of the outputs of adjacent modules based upon the network architecture. Such an approach takes the number of modules in the PRNN into account through a unit-norm constraint on the coefficients of the cost function. This is particularly suitable because, due to the inherent nesting in the PRNN, every module contributes fully to the learning process, while the unit-norm constrained cost function introduces forgetting into the memory management of the PRNN. In the time series prediction simulations presented, the PRNN based upon the modified cost function outperforms existing PRNN schemes.
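
As a concrete illustration (not the authors' code), the following Python sketch contrasts the earlier RLS-style forgetting-factor weighting of the per-module errors with a unit-norm constrained weighting; the geometric form of the coefficients, the function names, and the example error values are illustrative assumptions.

    import numpy as np

    def prnn_cost_forgetting(errors, lam=0.9):
        # Earlier PRNN cost: E = sum_i lam**(i-1) * e_i**2,
        # with a forgetting factor lam motivated by RLS.
        errors = np.asarray(errors, dtype=float)
        weights = lam ** np.arange(len(errors))
        return np.sum(weights * errors**2)

    def prnn_cost_unit_norm(errors, lam=0.9, p=1):
        # Unit-norm constrained variant (illustrative): the same
        # geometric coefficients, rescaled so their L_p norm is 1.
        # The rescaling depends on the number of modules M, so the
        # "forgetting" is tied to the network architecture.
        errors = np.asarray(errors, dtype=float)
        w = lam ** np.arange(len(errors))
        w = w / np.linalg.norm(w, ord=p)  # enforce ||w||_p = 1
        return np.sum(w * errors**2)

    # Example: a five-module PRNN with illustrative module errors
    e = [0.30, 0.25, 0.20, 0.28, 0.35]
    print(prnn_cost_forgetting(e))  # RLS-style weighting
    print(prnn_cost_unit_norm(e))   # architecture-aware, unit-norm weighting

Under the unit-norm constraint the coefficients are normalized over the M modules, so the relative emphasis between adjacent modules, rather than the raw scale of the cost, carries the forgetting.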