Sequential Learning in Feedforward Networks: Proactive and Retroactive Interference Minimization
ICANN '02 Proceedings of the International Conference on Artificial Neural Networks
We analyze the conditions under which a memory system is prone to interference between new and old items: essentially, the distributedness of the representation and the lack of retraining. Both are, however, desirable features, providing compactness and speed. We therefore propose a two-stage framework to mitigate interference in such systems, based on exploiting the information available at each moment. The two stages are separated by the instant at which a new item becomes known: (a) interference prevention, prior to that instant, consists in preparing the system to minimize the impact of learning new items, and (b) retroactive interference minimization, after that instant, seeks to learn the new item while minimizing the damage inflicted on the old items. The subproblems addressed at each stage are stated rigorously, and possible methods for solving each of them are presented.
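The retroactive stage can be illustrated with a minimal NumPy sketch. Purely as an assumption (the abstract does not specify the method), the stage is realized here as gradient descent on a combined loss: error on the new item plus a penalty, weighted by a hypothetical coefficient `lam`, that anchors the network's outputs on the old inputs to their pre-update values. The architecture, data, and penalty form are all illustrative, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(W1, W2, X):
    """One hidden tanh layer, linear output."""
    H = np.tanh(X @ W1)
    return H @ W2, H

def grads(W1, W2, X, Y):
    """Mean-squared-error gradients for a batch (X, Y)."""
    out, H = forward(W1, W2, X)
    err = (out - Y) / len(X)
    gW2 = H.T @ err
    gW1 = X.T @ (err @ W2.T * (1.0 - H ** 2))  # backprop through tanh
    return gW1, gW2

# Old items, learned first (the network's prior knowledge).
X_old = rng.standard_normal((8, 3))
Y_old = rng.standard_normal((8, 1))
W1 = 0.5 * rng.standard_normal((3, 6))
W2 = 0.5 * rng.standard_normal((6, 1))
for _ in range(3000):
    g1, g2 = grads(W1, W2, X_old, Y_old)
    W1, W2 = W1 - 0.1 * g1, W2 - 0.1 * g2

# A new item that must now be learned.
x_new = rng.standard_normal((1, 3))
y_new = np.array([[1.0]])

def mse_old(A1, A2):
    """Error on the old items: our measure of retroactive interference."""
    return float(np.mean((forward(A1, A2, X_old)[0] - Y_old) ** 2))

# (1) Naive sequential learning: fit only the new item.
W1_n, W2_n = W1.copy(), W2.copy()
for _ in range(500):
    g1, g2 = grads(W1_n, W2_n, x_new, y_new)
    W1_n, W2_n = W1_n - 0.05 * g1, W2_n - 0.05 * g2

# (2) Retroactive interference minimization: learn the new item while a
# penalty (weight `lam`, an assumed hyperparameter) anchors the outputs
# on the old inputs to their pre-update values.
Y_anchor = forward(W1, W2, X_old)[0]
lam = 3.0
W1_m, W2_m = W1.copy(), W2.copy()
for _ in range(500):
    g1n, g2n = grads(W1_m, W2_m, x_new, y_new)
    g1o, g2o = grads(W1_m, W2_m, X_old, Y_anchor)
    W1_m = W1_m - 0.05 * (g1n + lam * g1o)
    W2_m = W2_m - 0.05 * (g2n + lam * g2o)

err_new_before = ((forward(W1, W2, x_new)[0] - y_new) ** 2).item()
err_new_after = ((forward(W1_m, W2_m, x_new)[0] - y_new) ** 2).item()
print("old-item error, naive learning:      ", mse_old(W1_n, W2_n))
print("old-item error, interference-minimized:", mse_old(W1_m, W2_m))
```

Comparing the two printed errors shows the trade-off the abstract describes: the anchored update still learns the new item, but inflicts far less damage on the old ones than the naive update.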