Implementing plastic weights in neural networks using low precision arithmetic

  • Authors:
  • Christopher Johansson; Anders Lansner

  • Affiliations:
  • School of Computer Science and Communication, Royal Institute of Technology, Lindstedtsv. 3, Stockholm 100 44, Sweden (both authors)

  • Venue:
  • Neurocomputing
  • Year:
  • 2009


Abstract

In this letter, we develop a low-precision, fixed-point arithmetic implementation of an exponentially weighted moving average (EWMA) for use in a neural network with plastic weights. We analyze the proposed design both analytically and experimentally, and we evaluate its performance in an attractor neural network application. The EWMA in the proposed design has a constant relative truncation error, which is important for avoiding round-off errors in applications with slowly decaying processes, e.g. connectionist networks. We conclude that the proposed design offers greatly improved memory and computational efficiency compared to a naive implementation of the EWMA's difference equation, and that it is well suited for implementation in digital hardware.
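To make the round-off problem concrete, the following is a minimal sketch of the naive fixed-point EWMA difference equation that the abstract uses as a baseline (not the authors' improved design). It assumes a smoothing factor of the form a = 2^-k, so the update y ← y + a·(x − y) reduces to an integer shift; the function name and parameters are illustrative only.

```python
# Naive fixed-point EWMA step (illustrative sketch, not the paper's design):
# y <- y + (x - y) * 2^-k, with the multiply implemented as a right shift.
def ewma_fixed_step(y, x, k):
    """One EWMA update in integer arithmetic; the shift truncates toward zero
    for non-negative differences."""
    return y + ((x - y) >> k)

# Feed a constant input: the state climbs toward the input but stalls as soon
# as the increment (x - y) >> k truncates to zero -- the round-off error that
# matters for slowly decaying processes.
y = 0
for _ in range(200):
    y = ewma_fixed_step(y, 1000, 4)
print(y)  # prints 985, not 1000: the last 15 counts are lost to truncation
```

Because the absolute truncation error here is roughly constant (one unit per step), the *relative* error blows up as the tracked quantity decays toward zero; a representation with constant relative truncation error, as proposed in the paper, avoids this stall.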