Temporal finite-state machines: a novel framework for the general class of dynamic networks
ICONIP'12 Proceedings of the 19th international conference on Neural Information Processing - Volume Part II
Liquid-state machines (LSMs) are a class of neural networks that support multitasking by representing input information implicitly across the entire network. How exactly this input information is represented, and how the computations are carried out, however, remains unresolved. To address this issue, we demonstrate how an LSM can process different inputs as a varying set of transiently stable states of collective activity, achieved by adopting a relatively complex dynamic synapse model. We also discuss the relevance of the resulting framework for mimicking complex cortical functions such as content-addressable memory.
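The abstract's idea of encoding input as transient collective states can be loosely illustrated with a generic rate-based reservoir (echo-state) sketch. This is not the paper's spiking model or its dynamic synapse mechanism; all sizes, weights, and the toy delay task below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes -- not taken from the paper.
N_IN, N_RES, T = 1, 50, 200

# Random input and recurrent weights; the spectral radius is scaled
# below 1 so the reservoir has fading memory (echo-state property).
W_in = rng.uniform(-0.5, 0.5, (N_RES, N_IN))
W = rng.uniform(-0.5, 0.5, (N_RES, N_RES))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with an input sequence u (T x N_IN) and
    return the trajectory of collective states (T x N_RES)."""
    x = np.zeros(N_RES)
    states = np.empty((len(u), N_RES))
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + W_in @ u_t)   # transient collective state
        states[t] = x
    return states

# Toy task: recover a delayed copy of the input with a linear readout
# trained on the reservoir's state trajectory.
u = rng.uniform(-1, 1, (T, N_IN))
X = run_reservoir(u)
target = np.roll(u[:, 0], 2)              # input delayed by 2 steps
W_out, *_ = np.linalg.lstsq(X[5:], target[5:], rcond=None)
pred = X[5:] @ W_out
```

The readout is linear on purpose: in reservoir-style computation, all temporal mixing happens inside the fixed recurrent network, and task-specific training touches only the output weights.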