We studied the hypothesis that synaptic dynamics is governed by three basic principles: (1) synapses adapt their weights so that neurons can effectively transmit information, (2) homeostatic processes stabilize the mean firing rate of the postsynaptic neuron, and (3) weak synapses adapt more slowly than strong ones, while the maintenance of strong synapses is costly. Our results show that a synaptic update rule derived from these principles shares features with spike-timing-dependent plasticity, is sensitive to correlations in the input, and is useful for synaptic memory. Moreover, input selectivity (sharply tuned receptive fields) of postsynaptic neurons develops only if stimuli with strong features are presented. Sharply tuned neurons can coexist with unselective ones, and the distribution of synaptic weights can be unimodal or bimodal. Formulating synaptic dynamics through an optimality criterion provides a simple graphical argument for the stability of synapses, which is necessary for synaptic memory.
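The abstract above does not give the derived update rule itself, but the three principles can be sketched as a toy rate-based weight update. Everything below is an illustrative assumption, not the paper's actual rule: a Hebbian correlation term for principle (1), a homeostatic drive toward a target rate for principle (2), and a weight-dependent learning speed plus a quadratic maintenance cost for principle (3). The function name `synaptic_update` and all parameter values (`eta`, `lam`) are hypothetical.

```python
def synaptic_update(w, pre_rate, post_rate, target_rate,
                    eta=0.01, lam=0.001):
    """Toy update combining the abstract's three principles.

    Illustrative sketch only; the functional form and the
    parameters eta (learning rate) and lam (maintenance cost)
    are assumptions, not the rule derived in the paper.
    """
    hebb = pre_rate * post_rate        # (1) correlation-driven Hebbian term
    homeo = target_rate - post_rate    # (2) homeostatic pull toward target rate
    # (3) the factor w makes weak synapses adapt slowly, while the
    # -lam * w**2 term imposes a cost on maintaining strong synapses
    dw = w * eta * (hebb + homeo) - lam * w**2
    return max(w + dw, 0.0)            # keep the weight non-negative

# Example: correlated pre/post activity at the target rate potentiates
w_new = synaptic_update(w=0.5, pre_rate=1.0, post_rate=1.0, target_rate=1.0)
```

With these assumed values the Hebbian term dominates the maintenance cost, so the weight grows slightly; raising `post_rate` above `target_rate` would instead engage the homeostatic term and slow or reverse that growth.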