CNLS '89: Emergent Computation. Proceedings of the Ninth Annual International Conference of the Center for Nonlinear Studies on Self-Organizing, Collective, and Cooperative Phenomena in Natural and Artificial Computing Networks
The electrical engineering handbook
Analog versus digital: extrapolating from electronics to neurobiology
Neural Computation
Information Theory and the Brain
A model of information transmission across a neuron is delineated in terms of source (stimulus), encoder, channel, decoder, and behaviour (response). From a cybernetic analysis of experimental data, we perform frequency-domain, time-domain and stability analyses and obtain the Bode, Nichols and Nyquist plots, the root-locus plane, the transfer function and the response equation, all confirmed by data. We consider a new paradigm of information theory based on the non-equilibrium dynamics of fluctuation, organization and information (Nicolis-Prigogine), which is the counterpart of the Shannon-Boltzmann approach to information-entropy based on equilibrium dynamics. Prigogine's theorem of minimum entropy production and Rosen's principle of optimum design were observed to characterize neural transmission in a particular test neuron operating near its optimal sensitivity regime. Using the Nyquist theorem and a generalized temperature concept, we compute the non-equilibrium entropy production and the equivalent neurodynamic temperature during neural information processing. A trans-information/temperature plot implies an order-disorder Bose transition and zero neurodynamic entropy (near 0°N) as an informational analog of the third law of thermodynamics (near 0°K). Neural applications are explored.
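The frequency-domain quantities named in the abstract (Bode magnitude and phase, Nyquist locus) can be illustrated with a minimal sketch. The transfer function below is a hypothetical first-order low-pass model standing in for the neuron's stimulus-to-response dynamics; the gain `K` and time constant `tau` are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical first-order transfer function H(s) = K / (tau*s + 1),
# an assumed stand-in for the neuron's stimulus -> response dynamics.
K, tau = 2.0, 0.01  # gain and time constant (s); illustrative values only

def freq_response(omega):
    """Complex response H(j*omega) of the assumed first-order model."""
    return K / (1j * omega * tau + 1)

omega = np.logspace(0, 4, 200)               # angular-frequency sweep (rad/s)
H = freq_response(omega)

mag_db = 20 * np.log10(np.abs(H))            # Bode magnitude curve (dB)
phase_deg = np.degrees(np.angle(H))          # Bode phase curve (degrees)
nyquist = np.column_stack([H.real, H.imag])  # points of the Nyquist locus

# At the corner frequency 1/tau the magnitude is 3 dB below its DC value
# and the phase lag is 45 degrees -- the textbook first-order signature.
corner = freq_response(1 / tau)
mag_drop_db = 20 * np.log10(abs(corner)) - 20 * np.log10(K)
print(round(mag_drop_db, 2))                    # -3.01
print(round(np.degrees(np.angle(corner)), 1))   # -45.0
```

Plotting `mag_db` and `phase_deg` against `omega` on log axes gives the Bode plot, and plotting the columns of `nyquist` against each other gives the Nyquist plot; the paper's actual transfer function and response equation are derived from experimental data, not from this assumed model.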