This article initiates a rigorous theoretical analysis of the computational power of circuits that employ modules for computing winner-take-all. Computational models that involve competitive stages have so far been neglected in computational complexity theory, although they are widely used in computational brain models, artificial neural networks, and analog VLSI. Our theoretical analysis shows that winner-take-all is a surprisingly powerful computational module in comparison with threshold gates (also referred to as McCulloch-Pitts neurons) and sigmoidal gates. We prove an optimal quadratic lower bound for computing winner-take-all in any feedforward circuit consisting of threshold gates. In addition, we show that arbitrary continuous functions can be approximated by circuits employing a single soft winner-take-all gate as their only nonlinear operation. Our theoretical analysis also provides answers to two basic questions raised by neurophysiologists in view of the well-known asymmetry between excitatory and inhibitory connections in cortical circuits: how much computational power of neural networks is lost if only positive weights are employed in weighted sums, and how much adaptive capability is lost if only the positive weights are subject to plasticity.
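As a minimal illustration of the two operations discussed in the abstract, the sketch below implements a hard winner-take-all module (output 1 at the position of the maximal input, 0 elsewhere) and a soft winner-take-all stand-in realized here via a softmax. Note that the softmax realization is an assumption made for this sketch only; it is one common graded-competition rule and is not necessarily the soft winner-take-all gate analyzed in the article, nor does this sketch reflect the circuit constructions or lower-bound arguments of the paper.

```python
import numpy as np

def hard_wta(x):
    """Hard winner-take-all: return a 0/1 vector with 1 at the position(s)
    of the maximal input. Ties produce multiple winners in this sketch."""
    x = np.asarray(x, dtype=float)
    return (x == x.max()).astype(int)

def soft_wta(x, temperature=1.0):
    """Soft winner-take-all stand-in via softmax (an assumption of this
    sketch): outputs are positive, sum to 1, and preserve the ordering of
    the inputs, so the largest input receives the largest output."""
    x = np.asarray(x, dtype=float) / temperature
    e = np.exp(x - x.max())  # subtract the max for numerical stability
    return e / e.sum()

# Example: the second input wins the hard competition.
print(hard_wta([0.2, 1.5, 0.7]))   # -> [0 1 0]
print(soft_wta([0.2, 1.5, 0.7]))   # graded outputs summing to 1
```

Lowering `temperature` sharpens the soft competition toward the hard winner-take-all output, which is one way to see the two modules as ends of a single family.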