Experimental data show that biological synapses behave quite differently from the symbolic synapses in all common artificial neural network models. Biological synapses are dynamic: their “weight” changes on a short timescale by several hundred percent, depending on the past input to the synapse. In this article we address the question of how this inherent synaptic dynamics (which should not be confused with long-term learning) affects the computational power of a neural network. In particular, we analyze computations on temporal and spatiotemporal patterns, and we give a complete mathematical characterization of all filters that can be approximated by feedforward neural networks with dynamic synapses. It turns out that even with just a single hidden layer, such networks can approximate a very rich class of nonlinear filters: all filters that can be characterized by Volterra series. This result is robust to various changes in the model of synaptic dynamics. For all nonlinear filters approximable by Volterra series, our characterization result also provides a new complexity hierarchy related to the cost of implementing such filters in neural systems.
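The kind of short-term synaptic dynamics described above can be illustrated with a small numerical sketch. The model below is a Tsodyks–Markram-style depressing synapse (an assumption for illustration; the function name and parameter values are not taken from the article): each spike consumes a fraction of the available synaptic resources, so the effective "weight" of a spike depends on the timing of past spikes and recovers between them.

```python
import math

def dynamic_synapse_response(spike_times, U=0.5, tau_rec=0.8, w=1.0):
    """Effective amplitude of each spike at a depressing dynamic synapse.

    Illustrative Tsodyks-Markram-style model (not the article's notation):
    R is the fraction of available synaptic resources; each spike uses a
    fraction U of R, and R recovers toward 1 with time constant tau_rec.
    """
    R = 1.0            # resources fully recovered before the first spike
    last_t = None
    amplitudes = []
    for t in spike_times:
        if last_t is not None:
            # exponential recovery of resources since the previous spike
            R = 1.0 - (1.0 - R) * math.exp(-(t - last_t) / tau_rec)
        a = w * U * R  # effective "weight" of this spike
        amplitudes.append(a)
        R -= U * R     # resources consumed by transmitting this spike
        last_t = t
    return amplitudes

# A rapid burst followed by a late spike: amplitudes drop by several
# hundred percent within the burst, then partially recover.
burst = [0.0, 0.02, 0.04, 1.5]
amps = dynamic_synapse_response(burst)
```

Running this on the burst shows successive amplitudes shrinking during the high-frequency portion and rebounding after the long pause, i.e. the synapse's output is a function of the temporal pattern of its past input, not just of the current spike.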