Pruning using parameter and neuronal metrics
Neural Computation
Accurate on-line support vector regression
Neural Computation
Constructive Incremental Learning from Only Local Information
Neural Computation
Incremental Learning and Model Selection for Radial Basis Function Network through Sleep
IEICE Transactions on Information and Systems
Fast learning in networks of locally-tuned processing units
Neural Computation
Evolving fuzzy neural networks for supervised/unsupervised online knowledge-based learning
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Incremental learning methods with retrieving of interfered patterns
IEEE Transactions on Neural Networks
A decade of Kasabov's evolving connectionist systems: a review
IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews
This paper proposes a biologically inspired incremental learning method for spatio-temporal patterns, based on our recently reported "incremental learning through sleep" (ILS) method. The method alternates between two learning phases: awake and sleep. During the awake phase, the system learns new spatio-temporal patterns by rote; during the sleep phase, it rehearses the recorded new memories interleaved with old ones. This rehearsal is essential for reconstructing the internal representation of the neural network so that it not only memorizes the new patterns while retaining old memories but also reduces redundant hidden units. Through this strategy, the neural network achieves high generalization ability. The method's most attractive property is its ability to learn non-independently distributed samples incrementally, without catastrophic forgetting, while using only a small amount of resources. We applied the method to an experiment on robot control signals, which vary depending on the context of the current situation.
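The alternating awake/sleep cycle described in the abstract can be illustrated with a minimal sketch. This is not the authors' algorithm (which operates on a radial basis function network); it only shows the control flow the abstract describes: rote recording while awake, then rehearsal of new memories interleaved with a sample of old ones during sleep. The class name `ILSSketch` and the `_consolidate` placeholder are assumptions introduced for illustration.

```python
import random

class ILSSketch:
    """Illustrative sketch of awake/sleep learning with interleaved
    rehearsal; NOT the exact ILS algorithm from the paper."""

    def __init__(self):
        self.short_term = []   # patterns recorded by rote while awake
        self.long_term = []    # consolidated memories

    def awake(self, patterns):
        # Awake phase: memorize new patterns by rote.
        self.short_term.extend(patterns)

    def sleep(self, rng=random):
        # Sleep phase: rehearse the new memories interleaved with a
        # sample of old memories, then consolidate all of them.
        old_sample = rng.sample(
            self.long_term, min(len(self.long_term), len(self.short_term)))
        rehearsal = self.short_term + old_sample
        rng.shuffle(rehearsal)
        for pattern in rehearsal:
            self._consolidate(pattern)
        self.short_term.clear()

    def _consolidate(self, pattern):
        # Placeholder for retraining the network on one pattern; here we
        # simply keep unique patterns, mimicking "keeping old memories
        # while memorizing new ones" without redundant units.
        if pattern not in self.long_term:
            self.long_term.append(pattern)
```

In the paper the consolidation step retrains the RBF network and prunes redundant hidden units; here `_consolidate` is a stand-in so the two-phase loop can be run end to end:

```python
learner = ILSSketch()
learner.awake(["A", "B"])
learner.sleep()
learner.awake(["C"])
learner.sleep()
# long_term now holds all patterns; short_term is empty
```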