Using noise to compute error surfaces in connectionist networks: a novel means of reducing catastrophic forgetting

  • Authors:
  • Robert M. French; Nick Chater

  • Affiliations:
  • Quantitative Psychology and Cognitive Science, Psychology Department, University of Liège, 4000 Liège, Belgium; Institute for Applied Cognitive Science, Department of Psychology, University of Warwick, Coventry CV4 7AL, U.K.

  • Venue:
  • Neural Computation
  • Year:
  • 2002

Abstract

In error-driven distributed feedforward networks, new information typically interferes, sometimes severely, with previously learned information. We show how noise can be used to approximate the error surface of previously learned information. By combining this approximated error surface with the error surface associated with the new information to be learned, the network's retention of previously learned items can be improved and catastrophic interference significantly reduced. Further, we show that the noise-generated error surface is produced using only first-derivative information and without recourse to any explicit error information.
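The general idea can be illustrated with a small, self-contained sketch. The code below is not the authors' procedure: it simply pairs random (noise) inputs with the trained network's own outputs so that the error on those pseudo-items stands in for the error surface of the previously learned items, and it interleaves gradient steps on the new items with gradient steps on the pseudo-items. The network architecture, hyperparameters, and data are illustrative assumptions.

```python
"""Hypothetical sketch: using noise-generated pseudo-items to approximate the
error surface of previously learned items while new items are being learned.
All choices (network size, learning rate, data) are illustrative assumptions."""
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class Net:
    """One-hidden-layer sigmoid network trained by plain gradient descent on squared error."""
    def __init__(self, n_in=8, n_hid=16, n_out=8, lr=0.5):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hid))
        self.W2 = rng.normal(0.0, 0.5, (n_hid, n_out))
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1)
        return sigmoid(self.h @ self.W2)

    def step(self, X, T):
        """One gradient step on the squared error for inputs X and targets T."""
        Y = self.forward(X)
        d_out = (Y - T) * Y * (1 - Y)                      # backprop error signal at the output layer
        d_hid = (d_out @ self.W2.T) * self.h * (1 - self.h)  # error signal at the hidden layer
        self.W2 -= self.lr * self.h.T @ d_out / len(X)
        self.W1 -= self.lr * X.T @ d_hid / len(X)

    def error(self, X, T):
        return float(np.mean((self.forward(X) - T) ** 2))

# Old and new tasks: random binary input-output associations (illustrative data).
X_old = rng.integers(0, 2, (20, 8)).astype(float)
T_old = rng.integers(0, 2, (20, 8)).astype(float)
X_new = rng.integers(0, 2, (20, 8)).astype(float)
T_new = rng.integers(0, 2, (20, 8)).astype(float)

net = Net()
for _ in range(2000):                  # learn the old items first
    net.step(X_old, T_old)

# Noise-generated pseudo-items: random inputs labelled with the trained network's
# own outputs.  Their error surface stands in for that of the old items, which are
# assumed to be no longer available.
X_noise = rng.random((100, 8))
T_noise = net.forward(X_noise)

for _ in range(2000):                  # learn the new items...
    net.step(X_new, T_new)
    net.step(X_noise, T_noise)         # ...while also descending the approximated old surface

print("old-task error after new learning:", round(net.error(X_old, T_old), 4))
print("new-task error:", round(net.error(X_new, T_new), 4))
```

Descending both surfaces at once keeps the weights in a region where the error on the old items remains low, which is the effect the abstract describes as improved retention and reduced catastrophic interference; the paper itself derives the noise-generated surface from first-derivative information rather than by interleaved rehearsal steps as done in this sketch.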