Neural network design and the complexity of learning
Neural Computation
Robust trainability of single neurons
Journal of Computer and System Sciences
Back-propagation is not efficient
Neural Networks
On the infeasibility of training neural networks with small squared errors
Advances in Neural Information Processing Systems 10 (NIPS 1997)
Computers and Intractability: A Guide to the Theory of NP-Completeness
Hardness results for neural network approximation problems
Theoretical Computer Science
Training a single sigmoidal neuron is hard
Neural Computation
The question of whether deep neural network architectures can be loaded (that is, trained to fit a given set of examples) efficiently is examined by considering the class of pyramidal architectures, which permits only limited interaction among the nodes. Even under this restriction, the loading problem remains NP-complete. This provides evidence that depth alone is a source of loading hardness.
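To make the loading decision problem concrete, the following sketch brute-forces it for a hypothetical toy pyramidal net (4 inputs, 2 hidden threshold units each seeing only 2 adjacent inputs, 1 output unit), with weights and biases drawn from a small finite set. The architecture, weight range, and helper names are illustrative assumptions, not from the paper; the exhaustive search is exponential in the number of parameters, consistent with (but of course not a proof of) the problem's NP-completeness.

```python
from itertools import product

def threshold(ws, b, xs):
    # Linear threshold unit: fires iff the weighted sum plus bias is positive.
    return 1 if sum(w * x for w, x in zip(ws, xs)) + b > 0 else 0

def loadable(examples, weight_vals=(-1, 0, 1)):
    """Decide by brute force whether a tiny 2-level pyramidal net
    (4 inputs -> 2 hidden units -> 1 output, each hidden unit seeing
    only 2 adjacent inputs) can fit all the given examples.
    Illustrative only: weights/biases range over a small finite set."""
    # 9 parameters: (w, w, b) per hidden unit, (v, v, c) for the output unit.
    for (w1a, w1b, b1, w2a, w2b, b2, v1, v2, c) in product(weight_vals, repeat=9):
        ok = True
        for (x1, x2, x3, x4), y in examples:
            h1 = threshold((w1a, w1b), b1, (x1, x2))
            h2 = threshold((w2a, w2b), b2, (x3, x4))
            if threshold((v1, v2), c, (h1, h2)) != y:
                ok = False
                break
        if ok:
            return True
    return False

# AND of all four inputs fits this pyramid (e.g., weights 1,1 and bias -1 everywhere).
examples = [((a, b, c, d), int(a and b and c and d))
            for a in (0, 1) for b in (0, 1) for c in (0, 1) for d in (0, 1)]
print(loadable(examples))  # prints True
```

By contrast, a target like XOR of the first two inputs is not realizable by this pyramid at all, so the same search exhausts every parameter setting and answers no; the point is that the only obvious general strategy is this exponential enumeration.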