The complexity of Boolean functions
The complexity of finite functions
Handbook of theoretical computer science (vol. A)
On the computational power of sigmoid versus boolean threshold circuits (extended abstract)
SFCS '91 Proceedings of the 32nd annual symposium on Foundations of computer science
The complexity of the parity function in unbounded fan-in, unbounded depth circuits
Theoretical Computer Science
Computational limitations on training sigmoid neural networks
Information Processing Letters
Finiteness results for sigmoidal “neural” networks
STOC '93 Proceedings of the twenty-fifth annual ACM symposium on Theory of computing
Bounds for the computational power and learning complexity of analog neural nets
STOC '93 Proceedings of the twenty-fifth annual ACM symposium on Theory of computing
The Power of Approximation: A Comparison of Activation Functions
Advances in Neural Information Processing Systems 5, [NIPS Conference]
The Connectionist Inductive Learning and Logic Programming System
Applied Intelligence
Neural circuits for pattern recognition with small total wire length
Theoretical Computer Science - Natural computing
Spiking neural controllers for pushing objects around
SAB'06 Proceedings of the 9th international conference on From Animals to Animats: simulation of Adaptive Behavior
Mathematical and Computer Modelling: An International Journal
We show that neural networks with three-times continuously differentiable activation functions can compute a certain family of n-bit Boolean functions with only two gates, whereas networks composed of binary threshold gates require at least Ω(log n) gates. Thus, for a large class of activation functions, analog neural networks can be more powerful than discrete neural networks, even when computing Boolean functions.
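The intuition behind such separations (this is an illustrative sketch, not the paper's actual construction) is that a gate with a smooth activation emits a real-valued output that can encode all n input bits at once, whereas a binary threshold gate emits only one bit. With distinct dyadic weights, a single sigmoid gate's output is injective over all 2^n Boolean inputs:

```python
import itertools
import math

def sigmoid_gate(x, weights):
    # Weighted sum followed by the logistic sigmoid (a smooth activation).
    s = sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-s))

def threshold_gate(x, weights, theta=0.5):
    # The same weighted sum, but thresholded to a single bit.
    s = sum(w * xi for w, xi in zip(weights, x))
    return int(s >= theta)

n = 8
weights = [2.0 ** -(i + 1) for i in range(n)]  # 1/2, 1/4, ..., 1/2^n

inputs = list(itertools.product([0, 1], repeat=n))
analog_outputs = {sigmoid_gate(x, weights) for x in inputs}
binary_outputs = {threshold_gate(x, weights) for x in inputs}

# The analog gate's output takes 2^n distinct values (it encodes the whole
# input), while the threshold gate can only ever produce two values.
print(len(analog_outputs))   # 256
print(len(binary_outputs))   # 2
```

A second gate downstream can then act on this information-preserving analog value, which is the kind of advantage a discrete threshold gate cannot replicate; the paper's two-gate construction exploits this gap to force an Ω(log n) blow-up for threshold circuits.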