Dealing with nonstationary processes requires quick adaptation while avoiding catastrophic forgetting. We present a neural learning technique that satisfies both requirements without sacrificing the benefits of distributed representations. It relies on formalizing the problem as the minimization of the error over the previously learned input-output patterns, subject to the constraint of perfectly encoding the new pattern. This constrained optimization problem is then transformed into an unconstrained one with hidden-unit activations as variables. The new formulation leads to an algorithm for solving the problem, which we call learning with minimal degradation (LMD). Experimental comparisons of LMD with backpropagation are provided which, besides showing the advantages of using LMD, reveal the dependence of forgetting on the learning rate in backpropagation. We also explain why overtraining affects forgetting and fault tolerance, which are seen as related problems.
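The constrained-optimization idea above, minimally disturbing what was learned while exactly fitting a new pattern, can be illustrated in miniature. The sketch below is not the authors' LMD algorithm (which works on hidden-unit activations in a feedforward network); it is a hypothetical single-linear-layer analogue: given weights `W` trained on old patterns, it computes the minimum-norm weight change that makes the layer reproduce the new target exactly, which is the smallest perturbation to the old input-output mappings under that constraint.

```python
import numpy as np

def minimal_degradation_update(W, h_new, t_new):
    """Minimum-Frobenius-norm update dW such that (W + dW) @ h_new == t_new.

    W     : (m, n) weight matrix trained on previous patterns
    h_new : (n,) input (or hidden-activation) vector of the new pattern
    t_new : (m,) target vector the new pattern must encode exactly

    The closed-form rank-one correction dW = r h^T / ||h||^2, with
    r = t_new - W @ h_new, satisfies the constraint and has minimal norm,
    so old patterns orthogonal to h_new are left untouched.
    """
    residual = t_new - W @ h_new
    dW = np.outer(residual, h_new) / (h_new @ h_new)
    return W + dW

# Toy usage: force an identity layer to map [1, 2] -> [3, 4] exactly.
W = np.eye(2)
W_adapted = minimal_degradation_update(W, np.array([1.0, 2.0]),
                                       np.array([3.0, 4.0]))
```

In contrast to a gradient step of backpropagation, which only approaches the new target and perturbs weights in proportion to the learning rate, this correction enforces the new pattern as a hard constraint, mirroring the paper's formulation of perfect encoding subject to minimal degradation.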