In this review we explore the topic of sequential learning, where information to be learned and retained arrives in separate episodes over time, in the context of artificial neural networks. Most neural networks handle this kind of task very badly, as new learning completely disrupts information previously learned by the network. This problem, known as "catastrophic forgetting", has received considerable attention in the literature. We illustrate the catastrophic forgetting effect and summarise possible solutions. In particular, we review the literature relating to the pseudorehearsal mechanism, which is an effective solution to the catastrophic forgetting problem in backpropagation-type networks. We then review the related issues of capacity, forgetting, and the use of pseudorehearsal in Hopfield-type networks. Finally, we briefly discuss these issues in the context of cognition, and summarise interesting topics for further research.
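The pseudorehearsal idea mentioned above can be illustrated with a minimal sketch (an assumption of this note, not code from the review): before training on a new task, the already-trained network is probed with random inputs, and its outputs on those probes are recorded as "pseudo-items"; the new task is then trained interleaved with these pseudo-items, which anchors the network near its old input-output mapping. The toy single-layer sigmoid network, task patterns, and hyperparameters below are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(W, X, Y, lr=0.5, epochs=500):
    # Single-layer sigmoid network trained by gradient descent
    # on a cross-entropy-style error signal (P - Y).
    for _ in range(epochs):
        P = sigmoid(X @ W)
        W -= lr * X.T @ (P - Y) / len(X)
    return W

# Task A: two input patterns with target outputs (last column is a bias input).
XA = np.array([[1., 0., 1.], [0., 1., 1.]])
YA = np.array([[1.], [0.]])
W = train(np.zeros((3, 1)), XA, YA)

# Task B arrives later, as a separate learning episode.
XB = np.array([[1., 1., 1.]])
YB = np.array([[0.]])

# Pseudorehearsal: probe the trained network with random inputs and
# record its own outputs; these pseudo-items approximate the old mapping
# without needing access to the original task-A training data.
Xp = rng.random((20, 3))
Yp = sigmoid(Xp @ W)

# Naive sequential learning: train on task B alone (catastrophic forgetting).
W_naive = train(W.copy(), XB, YB)

# Pseudorehearsal: interleave task B with the pseudo-items.
W_pr = train(W.copy(), np.vstack([XB, Xp]), np.vstack([YB, Yp]))

err_naive = np.abs(sigmoid(XA @ W_naive) - YA).mean()
err_pr = np.abs(sigmoid(XA @ W_pr) - YA).mean()
print(f"task-A error after naive sequential training: {err_naive:.3f}")
print(f"task-A error with pseudorehearsal:            {err_pr:.3f}")
```

Run as-is, the pseudorehearsal network retains task A far better than the naively retrained one, because the pseudo-item gradients pull the weights back toward the old solution while the new item is learned.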