On the computational power of neural nets. COLT '92: Proceedings of the Fifth Annual Workshop on Computational Learning Theory.
Finiteness results for sigmoidal “neural” networks. STOC '93: Proceedings of the Twenty-Fifth Annual ACM Symposium on Theory of Computing.
Bounds for the computational power and learning complexity of analog neural nets. STOC '93: Proceedings of the Twenty-Fifth Annual ACM Symposium on Theory of Computing.
Size-depth trade-offs for threshold circuits. STOC '93: Proceedings of the Twenty-Fifth Annual ACM Symposium on Theory of Computing.
On the computational power of depth 2 circuits with threshold and modulo gates. STOC '94: Proceedings of the Twenty-Sixth Annual ACM Symposium on Theory of Computing.
Polynomial bounds for VC dimension of sigmoidal neural networks. STOC '95: Proceedings of the Twenty-Seventh Annual ACM Symposium on Theory of Computing.
Computational complexity of neural networks: a survey. Nordic Journal of Computing.
Analog versus discrete neural networks. Neural Computation.
MUSP '09: Proceedings of the 9th WSEAS International Conference on Multimedia Systems & Signal Processing.
Designing neural networks for tackling hard classification problems. WSEAS Transactions on Systems.
A Note on a priori Estimations of Classification Circuit Complexity. Fundamenta Informaticae: Hardest Boolean Functions and O.B. Lupanov.
The power of constant depth circuits with sigmoid (i.e., smooth) threshold gates for computing Boolean functions is examined. It is shown that, at depth 2, constant size circuits of this type are strictly more powerful than constant size Boolean threshold circuits (i.e., circuits with Boolean threshold gates). On the other hand, for any constant depth d, polynomial size sigmoid threshold circuits with polynomially bounded weights compute exactly the same Boolean functions as the corresponding circuits with Boolean threshold gates.
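To make the two gate types concrete, here is a minimal sketch (not taken from the paper; gate names and the rounding-at-1/2 convention are illustrative assumptions). A Boolean threshold gate outputs 0/1 by comparing a weighted sum to a threshold, while a sigmoid gate applies a smooth squashing function to the same sum; thresholding the sigmoid output at 1/2 recovers the Boolean gate's decision.

```python
import math

def boolean_threshold_gate(weights, x, threshold):
    # Classic Boolean threshold gate: 1 iff the weighted sum reaches the threshold.
    return 1 if sum(w * xi for w, xi in zip(weights, x)) >= threshold else 0

def sigmoid_gate(weights, x, threshold):
    # Sigmoid ("smooth") threshold gate: real-valued output in (0, 1).
    s = sum(w * xi for w, xi in zip(weights, x)) - threshold
    return 1.0 / (1.0 + math.exp(-s))

# Both gates computing AND of two bits (weights 1, 1; threshold 2).
# Rounding the sigmoid output at 1/2 matches the Boolean gate here;
# scaling all weights by a large constant would sharpen the sigmoid
# toward the Boolean gate's step function.
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    b = boolean_threshold_gate([1, 1], x, 2)
    s = sigmoid_gate([1, 1], x, 2)
    print(x, b, round(s, 3), int(s >= 0.5))
```

The interesting point of the abstract is that this rounding intuition is not the whole story: at depth 2 the real-valued intermediate outputs give sigmoid circuits strictly more power at constant size, even though with polynomially bounded weights the two models coincide at every constant depth.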