Highly recurrent neural networks can learn reverberating circuits called Cell Assemblies (CAs), and such networks can be used to categorize input. This paper explores the ability of CAs to learn hierarchical categories. A simulator based on spiking fatiguing leaky integrate-and-fire neurons is presented with instances of base categories, and learning uses an unsupervised compensatory Hebbian rule. The model takes advantage of overlapping CAs, in which a neuron may participate in more than one CA. With this rule, the networks learn a hierarchy of categories that correctly categorizes 97% of base-category presentations in our test and 100% of the super-categories. A larger hierarchy is learned that correctly categorizes 100% of base categories and 89% of super-categories. It is also shown how novel subcategories inherit default information from their super-category. These simulations show that networks containing CAs can learn hierarchical categories, and the network can then successfully categorize novel inputs.
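The mechanism described above can be sketched roughly as follows. This is a minimal illustration, not the paper's simulator: the parameter names (leak divisor `d`, base threshold `theta`, fatigue step `F`, recovery `Fr`, total-weight target `W_B`) and all numeric values are illustrative assumptions, as is the specific form of the compensatory term, which scales Hebbian growth by how far a neuron's total incoming weight is from a target so that heavily connected neurons learn more slowly.

```python
import numpy as np

# Sketch: fatiguing leaky integrate-and-fire (fLIF) neurons with a
# compensatory Hebbian rule. All parameters are illustrative guesses.
rng = np.random.default_rng(0)
N = 50
W = rng.uniform(0.0, 0.1, size=(N, N))   # recurrent synaptic weights
np.fill_diagonal(W, 0.0)                 # no self-connections

activation = np.zeros(N)   # leaky membrane activation
fatigue = np.zeros(N)      # raises the effective firing threshold
fired = np.zeros(N, dtype=bool)

d, theta = 2.0, 1.0        # leak divisor and base threshold (assumed)
F, Fr = 0.3, 0.1           # fatigue gain on firing / recovery at rest
eta, W_B = 0.05, 4.0       # learning rate and total in-weight target

def step(external):
    """One step: integrate, fire, fatigue, then learn."""
    global activation, fatigue, fired
    prev_fired = fired
    # Leaky integration of recurrent spikes plus external input.
    activation = activation / d + W.T @ prev_fired + external
    fired = activation > theta + fatigue
    activation = np.where(fired, 0.0, activation)   # reset on spike
    fatigue = np.where(fired, fatigue + F,
                       np.maximum(fatigue - Fr, 0.0))
    # Compensatory Hebbian update: pre->post co-firing strengthens a
    # synapse, scaled by (W_B - total incoming weight) of the post
    # neuron, so saturated neurons stop gaining weight.
    total_in = W.sum(axis=0)
    dW = eta * np.outer(prev_fired, fired) * (W_B - total_in)[None, :]
    W[:] = np.clip(W + dW, 0.0, 1.0)
    np.fill_diagonal(W, 0.0)
    return fired

# Repeatedly present one pattern; the co-firing neurons should wire
# into a reverberating assembly.
pattern = np.zeros(N)
pattern[:25] = 1.5
total_spikes = 0
for _ in range(30):
    total_spikes += int(step(pattern).sum())
```

Because CAs may overlap, the same neurons can strengthen into several assemblies under different input patterns, which is what allows a shared super-category assembly to sit above distinct base-category assemblies.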