Although conditional branching between possible behavioral states is a hallmark of intelligent behavior, very little is known about the neuronal mechanisms that support this processing. In a step toward solving this problem, we demonstrate by theoretical analysis and simulation how networks of richly interconnected neurons, such as those observed in the superficial layers of the neocortex, can embed reliable, robust finite state machines. We show how a multistable neuronal network containing a number of states can be created very simply by coupling two recurrent networks whose synaptic weights have been configured for soft winner-take-all (sWTA) performance. These two sWTAs have simple, homogeneous, locally recurrent connectivity except for a small fraction of recurrent cross-connections between them, which are used to embed the required states. This coupling between the maps allows the network to continue to express the current state even after the input that elicited that state is withdrawn. In addition, a small number of transition neurons implement the necessary input-driven transitions between the embedded states. We provide simple rules to systematically design and construct neuronal state machines of this kind. The significance of our finding is that it offers a method whereby the cortex could construct networks supporting a broad range of sophisticated processing by applying only small specializations to the same generic neuronal circuit.
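The core mechanism described above can be illustrated with a minimal rate-based simulation: two soft winner-take-all maps with homogeneous local connectivity, plus a small fraction of cross-connections between them that keeps the selected state active after the input is withdrawn. This is only a sketch under stated assumptions: the parameter values (`alpha`, `beta`, `gamma`), the saturating activation, and all function names are illustrative choices, not taken from the paper, and the input-driven transition neurons are omitted for brevity.

```python
import numpy as np

def f(u):
    # Saturating threshold-linear activation. The saturation bound is an
    # assumption added for numerical stability; the paper's analysis uses
    # (unsaturated) threshold-linear units.
    return np.clip(u, 0.0, 1.0)

def simulate_coupled_swta(input_phases, n_states=2, steps_per_phase=300,
                          dt=0.1, alpha=1.2, beta=1.0, gamma=1.0):
    """Two coupled soft winner-take-all (sWTA) maps, xA and xB.

    Each map has homogeneous local connectivity: self-excitation `alpha`
    and shared inhibition `beta`. A sparse set of recurrent
    cross-connections (`gamma`) between corresponding units of the two
    maps embeds the persistent states. All parameter values here are
    illustrative assumptions.
    """
    xA = np.zeros(n_states)
    xB = np.zeros(n_states)
    for inp in input_phases:          # external input drives map A only
        for _ in range(steps_per_phase):
            drive_A = inp + alpha * xA - beta * xA.sum() + gamma * xB
            drive_B = alpha * xB - beta * xB.sum() + gamma * xA
            xA = xA + dt * (-xA + f(drive_A))
            xB = xB + dt * (-xB + f(drive_B))
    return xA, xB

# Select state 0, then withdraw all input: the cross-coupled pair keeps
# expressing state 0, while an uncoupled control (gamma=0) decays.
select_0 = np.array([1.0, 0.0])
silence  = np.array([0.0, 0.0])
xA, xB = simulate_coupled_swta([select_0, silence])
xA0, _ = simulate_coupled_swta([select_0, silence], gamma=0.0)
print(xA, xB)    # state 0 remains active in both maps
print(xA0)       # without cross-connections, activity decays toward zero
```

With the cross-connections present, the loop gain through the two maps exceeds one for the winning unit, so its activity is sustained after the input is removed; setting `gamma=0` breaks that loop and the state is lost, which is the role the abstract attributes to the recurrent cross-connections. Transitions between embedded states would require the additional transition neurons described in the paper.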