Experience-induced neural circuits that achieve high capacity
Neural Computation
The hippocampus as a stable memory allocator for cortex
Neural Computation
Synchrony-driven recruitment learning addresses the question of how arbitrary concepts, represented by synchronously active ensembles, may be acquired within a randomly connected, static graph of neuron-like elements. Recruitment learning in hierarchies is an inherently unstable process. This paper presents conditions on the parameters of a feedforward network that ensure stable recruitment hierarchies. The parameter analysis is conducted using a stochastic population approach to model a spiking neural network. The resulting network converges to activate a desired number of units at each stage of the hierarchy. The original recruitment method is modified in three ways: feedforward connection density is increased to ensure sufficient activation, temporally distributed feedforward delays are introduced to separate inputs in time, and excess activation is limited by lateral inhibition. The task of activating a desired number of units from a population is performed similarly to a temporal k-winners-take-all network.
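The temporal k-winners-take-all behavior mentioned above can be sketched roughly as follows. This is an illustrative toy model, not the paper's implementation: the function name, the threshold, and the global-inhibition increment are all assumptions made for the sketch.

```python
def temporal_kwta(inputs, k, threshold=1.0, inhibition=0.5):
    """Illustrative temporal k-WTA sketch (hypothetical parameters).

    Each unit's input is a (arrival_time, strength) pair, standing in for
    a feedforward spike volley arriving after a distributed delay.
    Earlier-arriving inputs are considered first; each unit that fires
    adds to a global lateral-inhibition term, suppressing later, weaker
    inputs until at most k units are active.
    """
    # Consider units in order of input arrival time (earlier first).
    order = sorted(range(len(inputs)), key=lambda i: inputs[i][0])
    winners = []
    inh = 0.0
    for i in order:
        _t, strength = inputs[i]
        # Fire only if input still clears threshold after inhibition.
        if strength - inh >= threshold:
            winners.append(i)
            inh += inhibition  # lateral inhibition from the new winner
        if len(winners) >= k:
            break  # desired activation level reached
    return winners
```

The key design point this sketch mirrors is that selection is temporal rather than a global sort by strength: an early, moderately strong input can win over a later, stronger one because inhibition has already accumulated by the time the later input arrives.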