Understanding how sequences are learned and encoded is a key component of understanding cognition. We present a recruitment model in which sequences are learned via the hierarchical binding of features across time. Learning in the model is unsupervised and occurs within a single presentation of the input. The model uses leaky integrate-and-fire neurons with biologically grounded learning mechanisms, and its topology and learning rules allow the network to exploit the temporal structure of the input in order to recruit localized representations of sequences. The model learns a temporal XOR-style task, and ablation tests are performed to justify the inclusion of particular features in the model. The model is then extended and applied to the task of learning 7-digit sequences. Both sets of simulations demonstrate the model's ability to acquire and reuse chunks. Limitations and future extensions of the model are then discussed.
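For readers unfamiliar with the neuron model named above, the following is a minimal, illustrative sketch of a single leaky integrate-and-fire neuron; all parameter values and the function name are assumptions for demonstration, not taken from the paper's implementation.

```python
import numpy as np

def simulate_lif(current, dt=1.0, tau=20.0, v_rest=-65.0,
                 v_reset=-65.0, v_thresh=-50.0, r=10.0):
    """Simulate one leaky integrate-and-fire neuron.

    current: input current per time step; parameter values here are
    generic textbook choices, not those used in the paper.
    Returns the membrane-potential trace and spike-time indices.
    """
    v = v_rest
    trace, spikes = [], []
    for t, i_t in enumerate(current):
        # Leaky integration: decay toward rest, driven by input current.
        v += (dt / tau) * (-(v - v_rest) + r * i_t)
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset  # reset membrane potential after a spike
        trace.append(v)
    return np.array(trace), spikes

# Constant suprathreshold input yields regular spiking.
trace, spikes = simulate_lif(np.full(200, 2.0))
```

In such a sketch, a learned sequence representation would correspond to downstream units whose spikes depend on the relative timing of inputs like this one; the paper's recruitment and binding mechanisms operate on top of this basic neuron dynamic.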