Parallel distributed processing: explorations in the microstructure of cognition, vol. 2: psychological and biological models
Cognitive science: an introduction
Neural Computation
Unsupervised Learning in LSTM Recurrent Neural Networks. ICANN '01 Proceedings of the International Conference on Artificial Neural Networks
Neural Networks - 2005 Special issue: IJCNN 2005
A Novel Connectionist System for Unconstrained Handwriting Recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence
Incremental learning of complex temporal patterns. IEEE Transactions on Neural Networks
LSTM recurrent networks learn simple context-free and context-sensitive languages. IEEE Transactions on Neural Networks
Recurrent Neural Networks (RNNs) have shown good results on real-world temporal contextual data, but they fail on input sequences with long time lags. The Long Short-Term Memory (LSTM) model was built to successfully address the issue of large time lags in input data. However, LSTM is found lacking for tasks pertaining to lower-level cognitive processing, specifically information processing, storage, and recall, and it cannot learn in an unsupervised manner. Sustained Temporal Order Recurrent (STORE) networks are designed to encode the order of temporal data and can then recall the encoded data in veridical as well as non-veridical order using unsupervised learning. In this research we propose a fusion of the supervised-learning-based LSTM proposed by Jurgen Schmidhuber and the unsupervised-learning-based STORE proposed by Grossberg. To alternate between the two approaches, and to mimic the brain's information processing during sleep (internal input), we propose the CCS (Consolidation Control Unit), built on an in-depth cognitive foundation, to overcome the inability of LSTM to learn in an unsupervised manner and to work with lower-level cognitive processing. We conclude by providing experimental evidence of the efficiency of the proposed model by comparing it with the original LSTM model.
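The control flow the abstract describes, a consolidation unit that alternates between a supervised LSTM pathway (external input, "wake") and an unsupervised STORE-style pathway that encodes temporal order (internal input, "sleep"), can be sketched roughly as follows. This is a minimal illustrative sketch only: the class names (`StoreModule`, `ConsolidationControl`), the decay-based order encoding, and the phase labels are assumptions for exposition, not the paper's actual equations or API, and the supervised branch is a placeholder for LSTM training.

```python
class StoreModule:
    """Toy STORE-style buffer (an assumption, not Grossberg's equations):
    items are stored with decaying activations so that temporal order can
    be recalled veridically by sorting on activation strength."""
    def __init__(self, decay=0.9):
        self.decay = decay
        self.activations = {}  # item -> activation level

    def encode(self, item):
        # Unsupervised update: decay all stored activations, then give the
        # newest item the highest activation.
        for k in self.activations:
            self.activations[k] *= self.decay
        self.activations[item] = 1.0

    def recall_veridical(self):
        # Oldest item has decayed the most, so ascending activation
        # reproduces the original presentation order.
        return sorted(self.activations, key=lambda k: self.activations[k])


class ConsolidationControl:
    """Stand-in for the CCS (Consolidation Control Unit): switches between
    the supervised (wake) and unsupervised (sleep) pathways."""
    def __init__(self, store):
        self.store = store

    def process(self, sequence, phase):
        if phase == "wake":
            # Placeholder for supervised LSTM training on external input.
            return [f"supervised:{x}" for x in sequence]
        # Sleep phase: replay the sequence internally through the
        # unsupervised STORE pathway and recall it in veridical order.
        for x in sequence:
            self.store.encode(x)
        return self.store.recall_veridical()
```

For example, `ConsolidationControl(StoreModule()).process(["A", "B", "C"], "sleep")` encodes the items without any target signal and recalls them in presentation order, while the `"wake"` phase routes the same sequence to the (placeholder) supervised pathway.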