Advances in Artificial Intelligence - Special issue on artificial intelligence in neuroscience and systems biology: lessons learnt, open problems, and the road ahead
Both Marcus (2001) and Jackendoff (2002) have emphasized the importance of finding credible explanations for the occurrence of variables within cognitive representations. Marcus, in particular, has argued that a prevailing form of connectionist modeling, eliminative connectionism, cannot adequately explain crucial forms of human generalization. Eliminative connectionism eschews explicitly represented variables, which, Marcus contends, play an essential role in the forms of generalization he considers. Recently, van der Velde and de Kamps (2006) proposed a neural blackboard architecture, which they claim satisfies the variable-representation needs that Marcus and Jackendoff identified. This letter argues, however, that closely related variants of Marcus's generalization examples impose variable requirements that are incompatible with the van der Velde and de Kamps approach. Moreover, it is argued here that these newly proposed variants present a severe challenge not only for eliminative connectionism but for all network training methods that require iterative tuning of synaptic strengths. The letter focuses on generalization cases that necessitate either the virtually instantaneous creation of variables or the very rapid deployment of preexisting variables within highly novel contexts.