In neural network research on language, the existence of discrete combinatorial rule representations is commonly denied; the combinatorial capacity of networks and brains is instead attributed to probability mapping and pattern overlay. Here, we demonstrate that networks incorporating relevant features of neuroanatomical connectivity and neuronal function give rise to discrete neuronal circuits that store combinatorial information and exhibit a function similar to elementary rules of grammar. Key properties of these networks are rich auto- and hetero-associative connectivity, the availability of sequence detectors similar to those found in a range of animals, and unsupervised Hebbian learning. Input of specific word sequences establishes sequence detectors in the network, and substitution of words and larger string segments of one syntactic category, occurring in the context of elements of a second syntactic class, binds them together into neuronal assemblies. Critically, these newly formed aggregates of sequence detectors respond in a discrete, generalizing fashion when members of specific substitution classes of string elements are combined with each other. The discrete combinatorial neuronal assemblies (DCNAs) even respond in the same way to learned strings and to word sequences that never appeared in the input but conform to a rule. We also show how combinatorial information interacts with information about functional and anatomical properties of the brain in the emergence of discrete neuronal circuits that may implement rules, and we discuss the model in the wider context of brain mechanisms for syntax and grammar. Implications for the evolution of human language are discussed in closing.
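The generalization step described above can be illustrated with a deliberately simplified toy sketch (not the authors' actual network): Hebbian learning strengthens a sequence detector for each attested word pair, words that share sequential contexts are treated as belonging to one substitution class, and an unseen pair is accepted when a same-class substitute of one of its words occurred in that position during training. The vocabulary, corpus, and helper names here are hypothetical illustrations only.

```python
import numpy as np

# Hypothetical toy vocabulary: two substitution classes (not from the source).
words = ["dog", "cat", "runs", "sleeps"]
idx = {w: i for i, w in enumerate(words)}
n = len(words)

# Training corpus: the pair ("cat", "sleeps") is deliberately withheld.
corpus = [("dog", "runs"), ("dog", "sleeps"), ("cat", "runs")]

# Unsupervised Hebbian learning: the sequence detector (i -> j) is
# strengthened each time word i is immediately followed by word j.
W = np.zeros((n, n))
for w1, w2 in corpus:
    W[idx[w1], idx[w2]] += 1.0

def same_class(a, b):
    # Words bind into one assembly when their outgoing or incoming
    # sequential-context patterns overlap (shared substitution contexts).
    out = W[idx[a]] @ W[idx[b]]
    inn = W[:, idx[a]] @ W[:, idx[b]]
    return (out + inn) > 0

def accepts(w1, w2):
    # A string is accepted if its detector fired during training, or if
    # a same-class substitute of either word occurred in that position.
    if W[idx[w1], idx[w2]] > 0:
        return True
    return any(W[idx[a], idx[w2]] > 0 and same_class(a, w1) for a in words) \
        or any(W[idx[w1], idx[b]] > 0 and same_class(b, w2) for b in words)

print(accepts("cat", "sleeps"))  # unseen combination, conforms to the "rule"
print(accepts("runs", "dog"))    # word order never licensed by the input
```

In this toy setting, "cat sleeps" is accepted even though it never appeared in the input, because "cat" and "dog" share a sequential context and "dog sleeps" was learned, while the reversed order "runs dog" is rejected; this mirrors, at a much coarser grain, the discrete generalization the abstract attributes to DCNAs.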