Introduction to the theory of neural computation
Neural networks with dynamic synapses
Neural Computation
An Introduction to the Modeling of Neural Networks
Associative memory with dynamic synapses
Neural Computation
Switching Dynamics of Neural Systems in the Presence of Multiplicative Colored Noise
IWANN '09 Proceedings of the 10th International Work-Conference on Artificial Neural Networks: Part I: Bio-Inspired Systems: Computational and Ambient Intelligence
Development of Neural Network Structure with Biological Mechanisms
IWANN '09 Proceedings of the 10th International Work-Conference on Artificial Neural Networks: Part I: Bio-Inspired Systems: Computational and Ambient Intelligence
In this work, we study, both analytically and with Monte Carlo simulations, how the competition between several activity-dependent synaptic processes, such as short-term synaptic facilitation and depression, affects the maximum memory storage capacity of a neural network. In contrast to synaptic depression, which drastically reduces the capacity of the network to store and retrieve “static” activity patterns, synaptic facilitation enhances the storage capacity in different contexts. In particular, we found optimal values of the relevant synaptic parameters (such as the neurotransmitter release probability or the characteristic facilitation time constant) for which the storage capacity is maximal and close to that obtained with static synapses, that is, without activity-dependent processes. We conclude that depressing synapses with a certain level of facilitation allow the network to recover the good retrieval properties of networks with static synapses while maintaining the nonlinear characteristics of dynamic synapses, which are convenient for information processing and coding.
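The short-term facilitation and depression mechanisms discussed above are commonly described by a phenomenological model in which each synapse carries a resource variable x (depression) and a release-probability variable u (facilitation). The sketch below is a minimal Euler integration of that standard model; the parameter values (U, tau_rec, tau_fac, the 20 Hz spike train) are illustrative assumptions, not the ones used in the paper.

```python
import numpy as np

def simulate_dynamic_synapse(spike_times, U=0.2, tau_rec=500.0,
                             tau_fac=300.0, dt=0.1, t_max=1000.0):
    """Euler integration of a standard short-term plasticity model
    (Tsodyks-Markram type): x tracks available synaptic resources
    (depression), u tracks the release probability (facilitation).
    Times are in milliseconds; illustrative parameters only."""
    steps = int(t_max / dt)
    spikes = {int(round(t / dt)) for t in spike_times}
    x, u = 1.0, U
    efficacies = []              # u*x sampled at each presynaptic spike
    for k in range(steps):
        # continuous relaxation toward the resting values
        x += dt * (1.0 - x) / tau_rec
        u += dt * (U - u) / tau_fac
        if k in spikes:
            u += U * (1.0 - u)        # facilitation: u jumps at a spike
            efficacies.append(u * x)  # transmitted synaptic efficacy
            x -= u * x                # depression: resources are consumed
    return efficacies

# Periodic 20 Hz presynaptic train (one spike every 50 ms)
eff = simulate_dynamic_synapse(spike_times=np.arange(50.0, 1000.0, 50.0))
```

With these parameters depression dominates, so the efficacy u*x decays over the train; increasing tau_fac or lowering U shifts the balance toward facilitation, which is the competition whose effect on storage capacity is studied in the paper.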