On Chaos and Neural Networks: The Backpropagation Paradigm

  • Authors:
  • K. Bertels; L. Neuberg; S. Vassiliadis; D. G. Pechanek

  • Affiliations:
  • University of Namur, Dept. of Business Administration, Rempart de la Vierge 8, 5000 Namur, Belgium; University of Namur, Dept. of Business Administration, Rempart de la Vierge 8, 5000 Namur, Belgium; T.U. Delft, Electrical Engineering Department, Mekelweg 4, 2628 CD Delft, The Netherlands; IBM Microelectronics Division, Research Triangle Park, North Carolina 27709, USA

  • Venue:
  • Artificial Intelligence Review
  • Year:
  • 2001

Abstract

In training feed-forward neural networks using the backpropagation algorithm, a sensitivity to the values of the parameters of the algorithm has been observed. In particular, it has been observed that this sensitivity with respect to the values of the parameters, such as the learning rate, plays an important role in the final outcome. In this tutorial paper, we look at neural networks from a dynamical systems point of view and examine their properties. To this purpose, we collect results regarding chaos theory as well as the backpropagation algorithm and establish a relationship between them. As an example, we study in detail the learning of the exclusive OR (XOR), an elementary Boolean function. The following conclusions hold for our XOR neural network: no chaos appears for learning rates lower than 5; when chaos occurs, it disappears as learning progresses; and for non-chaotic learning rates, the network learns faster than for learning rates at which chaos occurs.
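The experiment described in the abstract can be reproduced in spirit with a small script. The sketch below is not the authors' code: it assumes a 2-2-1 sigmoid network trained by plain batch backpropagation on the four XOR patterns, and simply records the sum-of-squared-errors trajectory so its behaviour can be compared across a small learning rate and a large one (the paper reports chaotic behaviour only for learning rates of roughly 5 and above).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor(learning_rate, epochs=2000, seed=0):
    """Train a hypothetical 2-2-1 sigmoid network on XOR with batch backprop."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    # Biases are folded into the weight matrices via an appended constant input.
    W1 = rng.uniform(-0.5, 0.5, size=(3, 2))   # input (+bias) -> hidden
    W2 = rng.uniform(-0.5, 0.5, size=(3, 1))   # hidden (+bias) -> output

    errors = []
    for _ in range(epochs):
        # Forward pass over all four XOR patterns.
        Xb = np.hstack([X, np.ones((4, 1))])
        H = sigmoid(Xb @ W1)
        Hb = np.hstack([H, np.ones((4, 1))])
        Y = sigmoid(Hb @ W2)

        # Backward pass: standard delta rule for sigmoid units.
        err = T - Y
        errors.append(float(np.sum(err ** 2)))
        delta_out = err * Y * (1 - Y)
        delta_hid = (delta_out @ W2[:2].T) * H * (1 - H)

        W2 += learning_rate * Hb.T @ delta_out
        W1 += learning_rate * Xb.T @ delta_hid
    return errors

if __name__ == "__main__":
    # Compare a small, a moderate, and a large learning rate (values illustrative).
    for eta in (0.5, 2.0, 6.0):
        trace = train_xor(eta)
        print(f"eta={eta}: final SSE={trace[-1]:.4f}")
```

Plotting the returned error trajectories (or the weight values over epochs) is the natural way to look for the irregular, chaotic-looking transients at large learning rates versus the smooth convergence at small ones.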