A long-running debate concerns whether neural representations are encoded using a distributed or a local coding scheme. In both schemes, individual neurons respond to certain specific patterns of presynaptic activity; hence, rather than being dichotomous, the two schemes rest on the same representational mechanism. We argue that a population of neurons needs to be capable of learning both local and distributed representations, as appropriate to the task, and of generating both local and distributed codes in response to different stimuli. Many neural network algorithms that are often employed as models of cognitive processes fail to meet all of these requirements. In contrast, we present a neural network architecture that enables a single algorithm to efficiently learn, and respond using, both types of coding scheme.
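The shared mechanism behind the two schemes can be illustrated with a toy sketch (this is an illustrative assumption, not the architecture proposed in the paper): each neuron in a small population computes its match to a preferred presynaptic pattern, and the same activity vector can then be read out either as a local code (a single winning unit) or as a distributed code (all sufficiently active units). The population size, threshold rule, and function names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# A population of 6 neurons, each tuned to a specific pattern of
# presynaptic activity, represented by a normalized weight vector.
W = rng.random((6, 8))
W /= np.linalg.norm(W, axis=1, keepdims=True)

def respond(x, threshold=0.9):
    """Same representational mechanism, two readouts.

    Each neuron's activation is its match (dot product) to the input;
    the local code keeps only the single best-matching neuron, while
    the distributed code keeps every neuron whose activation is within
    `threshold` of the maximum.  (Illustrative thresholding rule.)
    """
    a = W @ (x / np.linalg.norm(x))
    local = np.zeros_like(a)
    local[np.argmax(a)] = 1.0                                 # one active unit
    distributed = (a > threshold * a.max()).astype(float)     # possibly several
    return local, distributed

x = rng.random(8)
local, distributed = respond(x)
print(int(local.sum()), int(distributed.sum()))
```

Note that the distributed readout always contains the local winner, which makes concrete the point that the two codes differ only in how the same pattern-matching responses are interpreted, not in the underlying mechanism.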