We discuss in this short survey article some current mathematical models from neurophysiology for the computational units of biological neural systems: neurons and synapses. These models are contrasted with the computational units of common artificial neural network models, which reflect the state of knowledge in neurophysiology of 50 years ago. We then discuss the problem of carrying out computations in circuits consisting of biologically realistic computational units, focusing on the case, particularly relevant in biology, of computations on time series. Finite state machines are frequently used in computer science as models for computations on time series. One may argue that these machines provide a reasonable common conceptual basis for analyzing computations in computers and in biological neural systems, although in biological neural systems the emphasis shifts towards asynchronous computation on analog time series. In the second half of this article we discuss new computer experiments and theoretical results that address the question of whether a biological neural system can, in principle, learn to behave like a given simple finite state machine.
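To make the notion concrete, the following is an illustrative sketch (not taken from the article) of the kind of "simple finite state machine" at issue: a deterministic finite automaton that computes the parity of 1s in a binary time series, consuming one symbol per time step. All function and state names here are our own choices for illustration.

```python
# Illustrative sketch: a minimal deterministic finite state machine (DFA)
# computing the parity of 1s in a binary input sequence. This is the kind
# of computation on time series that the survey asks whether a biological
# neural system can, in principle, learn to emulate.

def make_parity_fsm():
    """Return (transition table, start state, accepting states) for the parity DFA."""
    transition = {
        ("even", 0): "even",  # a 0 leaves the parity unchanged
        ("even", 1): "odd",   # a 1 flips the parity
        ("odd", 0): "odd",
        ("odd", 1): "even",
    }
    return transition, "even", {"odd"}

def run_fsm(sequence):
    """Feed a binary time series through the DFA, one symbol per time step.

    Returns the state trajectory (one state per input symbol) and whether
    the final state is accepting (i.e., the sequence contains an odd
    number of 1s).
    """
    transition, state, accepting = make_parity_fsm()
    trajectory = []
    for symbol in sequence:
        state = transition[(state, symbol)]
        trajectory.append(state)
    return trajectory, state in accepting

trajectory, accepts = run_fsm([1, 0, 1, 1])
# trajectory records the machine's state after each input symbol;
# the sequence [1, 0, 1, 1] contains three 1s, so the final state is "odd".
```

The contrast drawn in the article is that such a machine updates its discrete state synchronously on discrete symbols, whereas biological neural circuits operate asynchronously on analog time series.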