Anticipation Model for Sequential Learning of Complex Sequences
Sequence Learning - Paradigms, Algorithms, and Applications
A neural model for temporal pattern generation is used and analyzed for training with multiple complex sequences presented in a sequential manner. The network exhibits some degree of interference when new sequences are acquired. It is proven that the model is capable of incrementally learning a finite number of complex sequences. The model is then evaluated with a large set of highly correlated sequences. While the number of intact sequences increases linearly with the number of previously acquired sequences, the amount of retraining caused by interference appears to be independent of the size of the existing memory. The model is extended to include a chunking network, which detects subsequences repeated between and within sequences. The chunking mechanism substantially reduces the amount of retraining in sequential training. Thus, the network investigated here constitutes an effective sequential memory. Various aspects of such a memory are discussed.
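The chunking idea described above, detecting subsequences that recur within one sequence or across different sequences, can be illustrated with a minimal non-neural sketch. The function name, the fixed chunk width, and the n-gram tabulation approach below are assumptions for illustration only, not the paper's neural mechanism:

```python
from collections import defaultdict

def repeated_chunks(sequences, width=3):
    """Find fixed-width subsequences that occur more than once,
    either within a single sequence or across different sequences.

    Returns a dict mapping each repeated chunk (as a tuple) to the
    list of (sequence index, start position) occurrences.
    """
    occurrences = defaultdict(list)
    for i, seq in enumerate(sequences):
        for start in range(len(seq) - width + 1):
            chunk = tuple(seq[start:start + width])
            occurrences[chunk].append((i, start))
    # Keep only chunks observed at least twice.
    return {c: locs for c, locs in occurrences.items() if len(locs) > 1}

# Example: two symbol sequences sharing the subsequence ('b', 'c', 'd').
seqs = [list("abcde"), list("xbcdy")]
chunks = repeated_chunks(seqs, width=3)
```

In this toy example the shared chunk ('b', 'c', 'd') is the only repeated subsequence; in the model, recognizing such shared structure is what lets previously learned material be reused rather than retrained.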