Electronic neuromorphic devices with on-chip, on-line learning should be able to quickly modify their synaptic couplings to acquire information about new patterns to be stored (synaptic plasticity) and, at the same time, preserve this information over very long time scales (synaptic stability). Here we describe the electronic implementation of a simple solution to this stability-plasticity problem that has recently been proposed and studied in various contexts. It rests on the observation that reducing the analog depth of the synapses to the extreme (bistable synapses) does not necessarily degrade the performance of the device as an associative memory, provided that 1) the number of neurons is large enough; 2) the transitions between stable synaptic states are stochastic; and 3) learning is slow. The drastic reduction of the analog depth of the synaptic variable also makes this solution attractive from the point of view of electronic implementation and offers a simple methodological alternative to the technological solution based on floating gates. We describe a full-custom analog very large-scale integration (VLSI) realization of a small network of integrate-and-fire neurons connected by bistable deterministic plastic synapses that can implement stochastic learning. In the absence of stimuli, the memory is preserved indefinitely. During stimulation, the synapse undergoes quick temporary changes driven by the activities of the pre- and postsynaptic neurons; these changes stochastically result in a long-term modification of the synaptic efficacy. The intentionally disordered pattern of connectivity allows the system to generate the randomness needed to drive the stochastic selection mechanism. Using a suitable stimulation protocol, we verify that the stochastic synaptic plasticity produces the expected pattern of potentiation and depression in the electronic network.
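The stochastic bistable plasticity described above can be illustrated with a minimal discrete-time simulation. This is a sketch under assumed parameters, not the paper's circuit equations: all names and constants (`THETA`, `STEP`, `DRIFT`, the spike probabilities) are illustrative. An internal synaptic variable drifts toward one of two stable states, pre/postsynaptic coincidences produce quick transient jumps, and only excursions across the bistability threshold become long-term transitions.

```python
import random

# Illustrative discrete-time model of a bistable stochastic synapse.
# The internal variable x in [0, 1] has two stable states (0 = depressed,
# 1 = potentiated). A deterministic refresh current drives x toward the
# nearest stable state; pre/post activity produces transient jumps, and a
# long-term change occurs only if x crosses the threshold THETA.

THETA = 0.5   # bistability threshold (assumed)
STEP = 0.1    # transient jump on a presynaptic spike (assumed)
DRIFT = 0.02  # refresh drift toward the nearest stable state (assumed)

def update(x, pre_spike, post_active):
    """One time step of the synaptic internal variable."""
    if pre_spike:
        # Presynaptic spike gates the update; its sign depends on the
        # postsynaptic state (up if active, down if quiescent).
        x += STEP if post_active else -STEP
    # Deterministic refresh toward the nearest stable state.
    x += DRIFT if x > THETA else -DRIFT
    return min(1.0, max(0.0, x))

def stimulate(x, steps, p_pre, p_post):
    """Stimulation with random pre spikes and random post activity;
    the irregular activity supplies the stochastic selection mechanism."""
    for _ in range(steps):
        x = update(x, random.random() < p_pre, random.random() < p_post)
    return x

random.seed(0)
# Correlated pre/post activity should mostly potentiate a depressed synapse:
potentiated = sum(stimulate(0.0, 200, 0.5, 0.9) > THETA for _ in range(100))
# Anti-correlated activity should mostly depress a potentiated one:
depressed = sum(stimulate(1.0, 200, 0.5, 0.1) < THETA for _ in range(100))
print(potentiated, depressed)
```

With these parameters the long-term transition is near-certain under strongly correlated stimulation; lowering the activity levels makes the threshold crossing a rare event, which is the slow, stochastic learning regime the abstract refers to.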