A Hypothesis on How the Neocortex Extracts Information for Prediction in Sequence Learning
ISNN '08 Proceedings of the 5th international symposium on Neural Networks: Advances in Neural Networks
Spatio-temporal memories for machine learning: a long-term memory organization
IEEE Transactions on Neural Networks
Robotics and Autonomous Systems
Temporal sequence learning is one of the most critical components of human intelligence. In this paper, a novel hierarchical structure for complex temporal sequence learning is proposed. Hierarchical organization, a prediction mechanism, and one-shot learning characterize the model. At the lowest level of the hierarchy, we use a modified Hebbian learning mechanism for pattern recognition. Our model treats both 0-valued and 1-valued sensory inputs as active. A winner-take-all (WTA) mechanism selects the active neurons whose outputs become the input for sequence learning at higher hierarchical levels. Prediction is an essential element of our temporal sequence learning model. By predicting correctly, the machine indicates that it already knows the current sequence and requires no additional learning. When the prediction is incorrect, one-shot learning is executed, and the machine learns the new input sequence as soon as the sequence is completed. A four-level hierarchical structure that isolates letters, words, sentences, and strophes is used in this paper to illustrate the model.
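The mechanisms named in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the dual-rail input encoding, the set-based sequence memory, and all function names below are illustrative assumptions. It shows three ideas: treating both 0s and 1s as active inputs, winner-take-all selection over Hebbian-style weights, and prediction-gated one-shot learning (learn a sequence only when it was not correctly predicted).

```python
import numpy as np

def dual_rail(x):
    """Encode a binary vector so that both 0s and 1s drive neurons.

    Assumption: 'active 0 and active 1 inputs' is modeled by concatenating
    the input with its complement, so a 0 in x activates a dedicated unit.
    """
    return np.concatenate([x, 1.0 - x])

def winner_take_all(weights, x, k=1):
    """Return the indices of the k neurons responding most strongly to x.

    weights: (n_neurons, n_inputs) Hebbian-style weight matrix.
    """
    scores = weights @ x
    return np.argsort(scores)[::-1][:k]

class OneShotSequenceMemory:
    """Prediction-gated one-shot learner (simplified sketch).

    A sequence already in memory is 'correctly predicted' and triggers no
    learning; an unknown sequence is stored in one shot once it completes.
    The paper's model predicts element-by-element within a hierarchy; a
    whole-sequence membership test is a deliberate simplification here.
    """
    def __init__(self):
        self.sequences = set()

    def observe(self, seq):
        seq = tuple(seq)
        if seq in self.sequences:
            return "predicted"      # known sequence: no learning needed
        self.sequences.add(seq)     # one-shot learning of the new sequence
        return "learned"
```

For example, `mem.observe("cat")` returns `"learned"` on first presentation and `"predicted"` on every later one, so repeated exposure to known material costs nothing, matching the abstract's claim that a correct prediction suppresses further learning.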