An Analysis on the Performance of Silicon Implementations of Backpropagation Algorithms for Artificial Neural Networks

  • Authors:
  • Leonardo M. Reyneri; Enrica Filippi

  • Venue:
  • IEEE Transactions on Computers - Special issue on artificial neural networks
  • Year:
  • 1991

Abstract

The effects of silicon implementation on the backpropagation learning rule in artificial neural systems are examined. The effects on learning performance of limited weight resolution, range limitations, and the steepness of the activation function are considered. A minimum resolution of about 20 to 22 bits is generally required, but this figure can be reduced to about 14 to 15 bits by properly choosing the value of the learning parameter eta that attains good performance in the presence of limited resolution. Performance can be further improved by using a modified batch backpropagation rule. The theoretical analysis is compared with ad hoc simulations, and the results are discussed in detail.
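
The effect of limited weight resolution described in the abstract can be emulated in software by re-quantizing the weights after every backpropagation update. The sketch below is only an illustration of that idea, not the authors' simulator: the network shape, data, and parameter names (n_bits, weight_range, eta) are assumptions chosen for demonstration.

```python
# Minimal sketch: emulate fixed-point weight resolution in a batch
# backpropagation update (illustrative only; not the paper's code).
import numpy as np

def quantize(w, n_bits=15, weight_range=4.0):
    """Round weights to the nearest representable fixed-point value
    on [-weight_range, +weight_range] with n_bits of resolution."""
    step = 2.0 * weight_range / (2 ** n_bits - 1)
    return np.clip(np.round(w / step) * step, -weight_range, weight_range)

def sigmoid(x, steepness=1.0):
    # The steepness parameter mimics the activation-function slope
    # discussed in the abstract.
    return 1.0 / (1.0 + np.exp(-steepness * x))

rng = np.random.default_rng(0)
X = rng.standard_normal((32, 8))               # toy input batch
t = (X.sum(axis=1, keepdims=True) > 0) * 1.0   # toy targets

W = quantize(rng.standard_normal((8, 1)) * 0.1)
eta = 0.05                                     # learning rate ("eta")

for epoch in range(100):
    y = sigmoid(X @ W)
    grad = X.T @ ((y - t) * y * (1 - y)) / len(X)  # batch gradient
    W = quantize(W - eta * grad)               # update, then re-quantize
```

Lowering n_bits in such an experiment degrades learning because small updates fall below the quantization step; raising eta (within stability limits) makes individual updates large enough to survive quantization, which is the trade-off the abstract refers to.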