We study the computational complexity of training a single spiking neuron N with binary-coded inputs and output that, in addition to adaptive weights and a threshold, has adjustable synaptic delays. We introduce a synchronization technique that generalizes the known nonlearnability results for spiking neurons with binary delays to arbitrary real-valued delays. In particular, the consistency problem for N with programmable weights, threshold, and delays, as well as its approximation version, is proven to be NP-complete. It follows that spiking neurons with arbitrary synaptic delays are not properly PAC learnable and do not admit robust learning unless RP = NP. In addition, the representation problem for N, the question of whether an n-variable Boolean function given in DNF (or as a disjunction of O(n) threshold gates) can be computed by a spiking neuron, is shown to be coNP-hard.
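To make the object of study concrete, the following is a minimal sketch of a simplified spiking neuron with weights, a threshold, and synaptic delays. The rectangular response window of unit length, the name `spiking_neuron`, and all parameter names are illustrative assumptions, not the paper's exact model: here each active input i emits a spike that arrives after delay d[i] and contributes weight w[i] for one time unit, and the neuron outputs 1 iff the summed potential reaches the threshold at some instant.

```python
def spiking_neuron(x, w, d, theta, eps=1.0):
    """Hypothetical simplified spiking neuron.

    x: binary input vector; w: synaptic weights; d: synaptic delays;
    theta: firing threshold; eps: assumed length of the rectangular
    response window of each spike.
    """
    n = len(x)
    # With rectangular windows the potential is piecewise constant and
    # only increases at spike-arrival instants, so it suffices to check
    # the arrival times of the active inputs as candidate firing times.
    times = [d[i] for i in range(n) if x[i] == 1]
    for t in times:
        # Potential at time t: sum of weights whose window [d[i], d[i]+eps)
        # covers t.
        potential = sum(w[i] for i in range(n)
                        if x[i] == 1 and d[i] <= t < d[i] + eps)
        if potential >= theta:
            return 1
    return 0
```

With all delays equal the unit degenerates to an ordinary threshold gate (e.g. `w = [1, 1]`, `d = [0, 0]`, `theta = 2` computes AND), while spreading the delays apart can prevent the same weights from ever summing simultaneously, which is the extra expressive dimension whose training the hardness results address.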