A theory of the learnable. Communications of the ACM.
A new polynomial-time algorithm for linear programming. Combinatorica.
How to construct random functions. Journal of the ACM (JACM).
On the learnability of Boolean formulae. STOC '87: Proceedings of the Nineteenth Annual ACM Symposium on Theory of Computing.
On the complexity of loading shallow neural networks. Journal of Complexity, Special Issue on Neural Computation.
On the capabilities of multilayer perceptrons. Journal of Complexity, Special Issue on Neural Computation.
Complete representations for learning from examples. Complexity in Information Theory.
Learnability and the Vapnik-Chervonenkis dimension. Journal of the ACM (JACM).
What size net gives valid generalization? Neural Computation.
Training a 3-node neural network is NP-complete. COLT '88: Proceedings of the First Annual Workshop on Computational Learning Theory.
Learning internal representations by error propagation. Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1.
Relations among complexity measures. Journal of the ACM (JACM).
Judd (1988) and Blum and Rivest (1988) have recently proved that the loading problem for neural networks is NP-complete. This makes it very unlikely that any algorithm, such as backpropagation, that varies the weights of a network of fixed size and topology will be able to learn in polynomial time. However, Valiant (1984) has proposed a learning protocol within which one can sensibly study generalization by learning algorithms that are free to add neurons and synapses, not merely to adjust weights. In this context, standard circuit-complexity arguments (any polynomial-time computation can be simulated by a polynomial-size Boolean circuit, which a sufficiently large network can emulate) show that learning algorithms with such freedom can solve in polynomial time any learning problem that can be solved in polynomial time by any algorithm whatsoever. In this sense, neural nets are universal learners, capable of learning any learnable class of concepts.
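To make Valiant's protocol concrete, here is a minimal Python sketch, not taken from the paper, of PAC learning a monotone conjunction with the standard elimination algorithm. The learner draws labeled examples from an oracle EX(c, D) and, given enough samples, outputs a hypothesis whose error is at most eps with probability at least 1 - delta. All names, the uniform distribution, and the parameter values are illustrative assumptions.

    import math
    import random

    def make_oracle(n, target, rng):
        """EX(c, D): draw x uniformly from {0,1}^n and label it with the
        target monotone conjunction c (the AND of the variables in `target`).
        The uniform distribution is an illustrative choice; PAC guarantees
        hold for any fixed distribution D."""
        def oracle():
            x = [rng.randint(0, 1) for _ in range(n)]
            return x, all(x[i] for i in target)
        return oracle

    def learn_conjunction(n, oracle, m):
        """Elimination algorithm: start from the conjunction of all n
        variables and drop any variable that is 0 in some positive example.
        Target variables are never dropped, so the hypothesis only errs by
        being too specific, and m samples drive that error below eps."""
        hyp = set(range(n))
        for _ in range(m):
            x, label = oracle()
            if label:
                hyp = {i for i in hyp if x[i]}
        return hyp

    if __name__ == "__main__":
        rng = random.Random(0)
        n, eps, delta = 20, 0.05, 0.05
        # Standard PAC sample bound for a finite hypothesis class:
        # m >= (1/eps) * (ln|H| + ln(1/delta)), with |H| = 2^n monotone
        # conjunctions over n variables.
        m = math.ceil((1 / eps) * (n * math.log(2) + math.log(1 / delta)))
        target = {2, 7, 11}
        hyp = learn_conjunction(n, make_oracle(n, target, rng), m)
        print(sorted(hyp))  # with high probability, exactly the target set

The freedom the abstract appeals to is broader still: a PAC learner may output any efficiently evaluable hypothesis, for example by growing a circuit or adding units to a network, rather than only tuning the weights of a fixed architecture.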