Training a single sigmoidal neuron is hard
Neural Computation
Minimizing the Quadratic Training Error of a Sigmoid Neuron Is Hard
ALT '01 Proceedings of the 12th International Conference on Algorithmic Learning Theory
Loading Deep Networks Is Hard: The Pyramidal Case
Neural Computation
The loading problem formulated by J. S. Judd seems to be a relevant model, from the complexity point of view, for supervised connectionist learning of feedforward networks. It is known that loading general network architectures is NP-complete (intractable) when the (training) tasks are also general. Many strong restrictions on architectural design and/or on the tasks do not suffice to avoid the intractability of loading. Judd concentrated on width-expanding architectures with constant depth and found a polynomial-time algorithm for loading restricted shallow architectures. He suppressed the effect of depth on loading complexity and left open a prototypical computational problem: the loading of easy regular triangular architectures, which might capture the crux of the difficulties with depth. We have proven this problem to be NP-complete. This result gives little hope for the existence of an efficient algorithm for loading deep networks.
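
To make the loading problem concrete, the sketch below (not from the papers above) states it as a decision/search task for a toy network of binary threshold units: given an architecture and a training task, find weights that reproduce every training pair. The 2-2-1 architecture, the {-1, 0, 1} weight grid, and the names (unit, forward, loads, brute_force_load) are all illustrative assumptions; the point is that a naive loader must search a space that grows exponentially in the number of parameters.

    # Illustrative sketch of Judd's loading problem (assumed setup, not the
    # paper's construction): binary threshold units, a fixed 2-2-1 network,
    # and an exhaustive search over a small discretized weight grid.
    from itertools import product

    def unit(inputs, weights, bias):
        """Binary threshold (hard-limiter) unit."""
        return 1 if sum(w * x for w, x in zip(weights, inputs)) >= bias else 0

    def forward(x, params):
        """2-2-1 network: two hidden units feed one output unit."""
        (w1, b1), (w2, b2), (w3, b3) = params
        h = (unit(x, w1, b1), unit(x, w2, b2))
        return unit(h, w3, b3)

    def loads(task, params):
        """Does this weight assignment reproduce every (input, target) pair?"""
        return all(forward(x, params) == y for x, y in task)

    def brute_force_load(task, grid=(-1, 0, 1)):
        """Exhaustive search: O(|grid| ** n_params) weight assignments."""
        for values in product(grid, repeat=9):  # 3 units x (2 weights + bias)
            params = [(values[i:i + 2], values[i + 2]) for i in (0, 3, 6)]
            if loads(task, params):
                return params
        return None  # the task cannot be loaded over this weight grid

    if __name__ == "__main__":
        xor = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
        print(brute_force_load(xor))  # finds a loading of XOR

Here 9 parameters already mean 3^9 = 19,683 candidate assignments; the NP-completeness results summarized above indicate that, unless P = NP, no algorithm can do essentially better than such exhaustive search on general (and even on regular triangular) architectures.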