Equivalence of backpropagation and contrastive Hebbian learning in a layered network

  • Authors: Xiaohui Xie; H. Sebastian Seung
  • Affiliations: Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA; Howard Hughes Medical Institute and Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA
  • Venue: Neural Computation
  • Year: 2003

Abstract

Backpropagation and contrastive Hebbian learning are two methods of training networks with hidden neurons. Backpropagation computes an error signal for the output neurons and spreads it over the hidden neurons. Contrastive Hebbian learning involves clamping the output neurons at desired values and letting the effect spread through feedback connections over the entire network. To investigate the relationship between these two forms of learning, we consider a special case in which they are identical: a multilayer perceptron with linear output units, to which weak feedback connections have been added. In this case, the change in network state caused by clamping the output neurons turns out to be the same as the error signal spread by backpropagation, except for a scalar prefactor. This suggests that the functionality of backpropagation can be realized alternatively by a Hebbian-type learning algorithm, which is suitable for implementation in biological networks.
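The relationship described in the abstract can be illustrated with a small numerical sketch. The Python snippet below is not the paper's formulation or notation; it assumes a one-hidden-layer perceptron with a sigmoid hidden layer, a linear output, and weak feedback of strength gamma from the output back to the hidden layer (all sizes and variable names are hypothetical). It runs a free phase and an output-clamped phase, forms contrastive Hebbian weight changes from the difference of the two phases, and compares them with the backpropagation gradients of the squared error; for small gamma the two should agree up to terms of order gamma, consistent with the scalar-prefactor equivalence stated above.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Hypothetical small network: input -> hidden (sigmoid) -> output (linear),
# with weak feedback of strength gamma from the output back to the hidden layer.
n_in, n_hid, n_out = 4, 5, 3
W1 = rng.normal(scale=0.5, size=(n_hid, n_in))
W2 = rng.normal(scale=0.5, size=(n_out, n_hid))
gamma = 1e-3                       # feedback strength; equivalence holds as gamma -> 0
x = rng.normal(size=(n_in, 1))     # input (clamped in both phases)
t = rng.normal(size=(n_out, 1))    # desired output

def relax(y_clamped=None, n_iter=50):
    """Settle the hidden/output states at a fixed point of the recurrent dynamics."""
    y = np.zeros((n_out, 1)) if y_clamped is None else y_clamped
    h = sigmoid(W1 @ x)
    for _ in range(n_iter):
        h = sigmoid(W1 @ x + gamma * W2.T @ y)   # hidden layer sees weak feedback
        if y_clamped is None:
            y = W2 @ h                           # linear output, free phase only
    return h, y

# Free phase: only the input is clamped.
h_free, y_free = relax()
# Clamped phase: the output is additionally clamped at the desired value.
h_clamp, y_clamp = relax(y_clamped=t)

# Contrastive Hebbian updates: clamped minus free outer products,
# rescaled by 1/gamma for the layer farther from the clamped output.
dW2_chl = (y_clamp @ h_clamp.T) - (y_free @ h_free.T)
dW1_chl = ((h_clamp @ x.T) - (h_free @ x.T)) / gamma

# Backpropagation of the squared-error loss 0.5 * ||t - y||^2
# through the purely feedforward network (gamma = 0).
h_ff = sigmoid(W1 @ x)
y_ff = W2 @ h_ff
err = t - y_ff                                   # output error signal
dW2_bp = err @ h_ff.T
delta_h = (W2.T @ err) * h_ff * (1 - h_ff)       # error spread to hidden neurons
dW1_bp = delta_h @ x.T

print("max |CHL - BP| for W2:", np.max(np.abs(dW2_chl - dW2_bp)))
print("max |CHL - BP| for W1:", np.max(np.abs(dW1_chl - dW1_bp)))
```

With gamma on the order of 1e-3, both printed discrepancies should be of the same order, i.e., small relative to the gradients themselves; shrinking gamma further shrinks the gap, which is the limit in which the contrastive Hebbian update matches backpropagation up to a scalar prefactor.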