Elements of information theory
An Information-Theoretic Approach to Neural Computing
Using Hippocampal `Place Cells' for Navigation, Exploiting Phase Coding
Advances in Neural Information Processing Systems 5, [NIPS Conference]
On embedding synfire chains in a balanced network
Neural Computation
Optimizing hierarchical temporal memory for multivariable time series
Proceedings of the 20th International Conference on Artificial Neural Networks (ICANN'10), Part II
An evolutionary network model of epileptic phenomena
Neurocomputing
Low cost remote gaze gesture recognition in real time
Applied Soft Computing
We analyze an optimal neural system that maps stimuli onto unique sequences of activations of fundamental atoms, or functional clusters (FCs). We call the system perfect because it maps every stimulus injectively, in minimum time and with the fewest FCs, such that each FC is activated only once. The system can sustain several sequences in parallel. In this framework, we study the capacity achievable by the system, the minimal completion time, and the complexity as a function of the number of parallel sequences. We show that the maximum capacity of the system is achieved without parallel sequences, at the expense of long completion times. When the capacity is fixed, however, the largest possible number of parallel sequences is optimal because it requires only short completion times. The complexity measure adds two important points: (i) the largest complexity of the system is achieved without parallel sequences, and (ii) the capacity is a good estimate of the complexity of the system.
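The capacity-versus-completion-time trade-off can be illustrated with a small combinatorial sketch. This is not the paper's model, only a hedged toy version under simplifying assumptions: `N` functional clusters, `k` parallel sequences drawn from disjoint pools of `N // k` clusters each, and each FC activated at most once, so a single sequence of length `T` from a pool of size `p` has `P(p, T) = p!/(p-T)!` distinguishable codewords. The names `capacity_bits` and `min_completion_time` are illustrative, not from the source.

```python
from math import perm, log2

def capacity_bits(n_fcs, k, T):
    """Capacity (bits) of k parallel sequences of length T, each drawn
    from a disjoint pool of n_fcs // k clusters with no FC reused.
    (Toy model: codeword counts of independent pools multiply.)"""
    pool = n_fcs // k
    if T > pool:
        return None  # cannot activate more FCs than the pool holds
    return k * log2(perm(pool, T))

def min_completion_time(n_fcs, k, required_bits):
    """Smallest sequence length T whose capacity reaches required_bits,
    or None if the target is unreachable with this configuration."""
    pool = n_fcs // k
    for T in range(1, pool + 1):
        if k * log2(perm(pool, T)) >= required_bits:
            return T
    return None

N = 24
# Maximum capacity: a single sequence (k = 1) running through all N FCs,
# i.e. a full permutation -- but completion time is the full length N.
print(capacity_bits(N, 1, N))          # log2(24!) ~ 79 bits, T = 24
# At a fixed capacity target, parallel sequences shorten completion time.
for k in (1, 2, 4):
    print(k, min_completion_time(N, k, 20))
```

In this sketch the single sequence attains the largest total capacity, while for a fixed 20-bit target the parallel configurations reach it with shorter sequences, mirroring the trade-off stated in the abstract.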