Widely accepted neural firing and synaptic potentiation rules specify a cross-dependence of the two processes which, because they evolve on different timescales, have traditionally been separated for analytic purposes, concealing essential dynamics. Here, the morphology of the firing-rate process, modulated by synaptic potentiation, is shown to be described by a discrete iteration map in the form of a thresholded polynomial. Given initial synaptic weights, firing activity is triggered by conductance. Elementary dynamic modes are defined by the fixed points, cycles, and saddles of the map, which serve as building blocks of the underlying firing code. Exhibiting a parameter-dependent multiplicity of real polynomial roots, the map is proved to be noninvertible. The incidence of chaos is then implied by the parameter-dependent existence of snap-back repellers. The highly patterned geometric and statistical structures of the associated chaotic attractors suggest that these attractors are an integral part of the neural code, and point to the chaotic attractor as a natural mechanism for statistical encoding and temporal multiplexing of neural information. The analytic findings are supported by simulation.
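The abstract does not give the coefficients of the thresholded polynomial map, so the following is only a minimal sketch of the kind of iteration described: a quadratic (logistic-type) polynomial update clipped to a firing-rate range, with sensitive dependence on initial conditions as a simple numerical signature of chaos. The function names, the polynomial form, and the parameter values `r` and `theta` are illustrative assumptions, not the paper's actual model.

```python
def thresholded_poly_map(x, r=3.9, theta=1.0):
    """One iteration of an assumed thresholded polynomial map:
    a quadratic polynomial update clipped to the interval [0, theta]."""
    y = r * x * (1.0 - x)            # illustrative quadratic polynomial term
    return min(max(y, 0.0), theta)   # threshold: firing rates stay in [0, theta]

def iterate(x0, n, r=3.9):
    """Return the trajectory x0, f(x0), f(f(x0)), ... of length n + 1."""
    xs = [x0]
    for _ in range(n):
        xs.append(thresholded_poly_map(xs[-1], r))
    return xs

# Sensitive dependence on initial conditions: two trajectories starting
# 1e-8 apart diverge to a macroscopic separation within a few dozen steps.
a = iterate(0.2, 50)
b = iterate(0.2 + 1e-8, 50)
print(max(abs(u - v) for u, v in zip(a, b)))
```

For these assumed parameters the clipping threshold acts only as a saturation bound on the rate; the qualitative point is that even a low-degree thresholded polynomial iteration can support the fixed points, cycles, and chaotic trajectories the abstract describes.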