2005 Special issue: Methods for reducing interference in the Complementary Learning Systems model: Oscillating inhibition and autonomous memory rehearsal

  • Authors:
  • Kenneth A. Norman; Ehren L. Newman; Adler J. Perotte

  • Affiliations:
  • Department of Psychology, Princeton University, Green Hall, Princeton, NJ 08544, USA (all authors)

  • Venue:
  • Neural Networks - Special issue: Computational theories of the functions of the hippocampus
  • Year:
  • 2005

Abstract

The stability-plasticity problem (i.e. how the brain incorporates new information into its model of the world, while at the same time preserving existing knowledge) has been at the forefront of computational memory research for several decades. In this paper, we critically evaluate how well the Complementary Learning Systems theory of hippocampo-cortical interactions addresses the stability-plasticity problem. We identify two major challenges for the model: finding a learning algorithm for cortex and hippocampus that enacts selective strengthening of weak memories and selective punishment of competing memories; and preventing catastrophic forgetting in the case of non-stationary environments (i.e. when items are temporarily removed from the training set). We then discuss potential solutions to these problems: First, we describe a recently developed learning algorithm that leverages neural oscillations to find weak parts of memories (so they can be strengthened) and strong competitors (so they can be punished), and we show how this algorithm outperforms other learning algorithms (CPCA Hebbian learning and Leabra) at memorizing overlapping patterns. Second, we describe how autonomous re-activation of memories (separately in cortex and hippocampus) during REM sleep, coupled with the oscillating learning algorithm, can reduce the rate of forgetting of input patterns that are no longer present in the environment. We then present a simple demonstration of how this process can prevent catastrophic interference in an AB-AC learning paradigm.
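
To make the oscillation idea concrete, the following is a minimal Python sketch in the spirit of the algorithm described above; it is not the authors' implementation. Inhibition is varied around its normal level: target units that drop out when inhibition is raised are treated as weak parts of the memory and strengthened, while non-target units that pop up when inhibition is lowered are treated as competitors and punished. The network size, the k-winners-take-all activation rule, the learning rate, and the training schedule are all assumptions made for this toy demonstration.

    import numpy as np

    N = 20       # number of units (assumed)
    K = 6        # units per pattern = normal number of winners (assumed)
    LRATE = 0.1  # learning rate (assumed)

    def kwta(net_input, k):
        # k-winners-take-all: the k units with the strongest input are active.
        # Raising inhibition = smaller k; lowering inhibition = larger k.
        active = np.zeros_like(net_input)
        active[np.argsort(net_input)[-k:]] = 1.0
        return active

    def recall(W, cue, k):
        # One settling step: external cue plus recurrent support, under kWTA.
        return kwta(W @ cue + cue, k)

    def oscillating_trial(W, target):
        # High-inhibition phase: fewer winners survive. Target units that
        # drop out are weak parts of the memory -> strengthen their support.
        act_hi = recall(W, target, k=K - 2)
        weak = (target == 1) & (act_hi == 0)
        W += LRATE * np.outer(weak, target)

        # Low-inhibition phase: extra winners are allowed. Non-target units
        # that pop up are competitors -> punish their support from the target.
        act_lo = recall(W, target, k=K + 2)
        competitors = (target == 0) & (act_lo == 1)
        W -= LRATE * np.outer(competitors, target)

        np.fill_diagonal(W, 0)  # no self-connections
        return W

    # Two overlapping binary patterns (they share three of their six units).
    A = np.zeros(N); A[[0, 1, 2, 3, 4, 5]] = 1
    B = np.zeros(N); B[[3, 4, 5, 6, 7, 8]] = 1

    W = np.zeros((N, N))
    for _ in range(20):  # interleaved training schedule (assumed)
        W = oscillating_trial(W, A)
        W = oscillating_trial(W, B)

    for name, pattern in [("A", A), ("B", B)]:
        out = recall(W, pattern, k=K)  # test at the normal inhibition level
        print(name, "recalled correctly:", np.array_equal(out, pattern))

In the same spirit, the autonomous-rehearsal idea could be mimicked in this toy by continuing to call oscillating_trial on patterns retrieved from the network itself, rather than supplied by the environment, after an item is removed from the training set; the paper's actual REM-sleep simulations are considerably richer than this sketch.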