Recurrent neural networks with trainable amplitude of activation functions

  • Authors:
  • Su Lee Goh;Danilo P. Mandic

  • Affiliations:
  • Imperial College of Science, Technology and Medicine, London SW7 2AZ, UK;Imperial College of Science, Technology and Medicine, London SW7 2AZ, UK

  • Venue:
  • Neural Networks
  • Year:
  • 2003

Abstract

An adaptive amplitude real-time recurrent learning (AARTRL) algorithm for fully connected recurrent neural networks (RNNs) employed as nonlinear adaptive filters is proposed. Such an algorithm is beneficial when dealing with signals that have rich and unknown dynamical characteristics. Following the approach from [Trentin, E. Networks with trainable amplitude of activation functions, Neural Networks 14 (2001) 471], three different cases for the algorithm are considered: a common adaptive amplitude shared among all the neurons; a separate adaptive amplitude for each layer; and a different adaptive amplitude for each neuron. Experimental results show that the AARTRL algorithm outperforms the standard RTRL algorithm.
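
The core idea described in the abstract, a nonlinearity whose amplitude is itself a trainable parameter, can be sketched as follows. This is a minimal illustration, assuming an activation of the form Phi(v) = lambda * tanh(v) with lambda adapted by a simple gradient-descent step on the squared output error; the class and parameter names are hypothetical and this is not the paper's exact AARTRL derivation, which adapts the amplitude jointly with the recurrent weights of the RNN.

```python
import numpy as np

class AdaptiveAmplitudeTanh:
    """Sketch of a tanh nonlinearity with a trainable amplitude lambda."""

    def __init__(self, amplitude=1.0, lr_amplitude=0.1):
        self.amplitude = amplitude        # trainable amplitude (lambda)
        self.lr_amplitude = lr_amplitude  # step size for the amplitude update

    def forward(self, net_input):
        # Activation with trainable amplitude: Phi(v) = lambda * tanh(v)
        self.base = np.tanh(net_input)
        return self.amplitude * self.base

    def update_amplitude(self, error):
        # For cost J = 0.5 * e^2 and Phi(v) = lambda * tanh(v),
        # dJ/dlambda = -e * tanh(v), so gradient descent gives
        # lambda <- lambda + lr * e * tanh(v) (averaged over the batch here).
        self.amplitude += self.lr_amplitude * np.mean(error * self.base)


# Toy usage: the amplitude adapts to match a target signal with a
# larger dynamic range than the initial unit-amplitude nonlinearity.
rng = np.random.default_rng(0)
act = AdaptiveAmplitudeTanh(amplitude=0.5, lr_amplitude=0.1)
v = rng.standard_normal(8)       # hypothetical net inputs to the neuron
d = 2.0 * np.tanh(v)             # hypothetical desired outputs
for _ in range(200):
    y = act.forward(v)
    act.update_amplitude(d - y)
print(f"learned amplitude: {act.amplitude:.3f}")  # approaches 2.0
```

In the paper's setting the same principle is applied inside an RNN trained with RTRL, with the three sharing schemes listed in the abstract (one amplitude for the whole network, one per layer, or one per neuron) determining how many such lambda parameters are adapted.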