The computational complexity of learning from binary examples is investigated for linear threshold neurons. We introduce combinatorial measures that give rise to classes of infinitely many learning problems with restricted samples, and we analyze how the complexity of these problems depends on the values of these measures. The results are established as dichotomy theorems showing that each problem is either NP-complete or solvable in polynomial time. In particular, we consider consistency and maximum consistency problems for neurons with binary weights, and maximum consistency problems for neurons with arbitrary weights. For each problem class we determine the dividing line between the NP-complete and the polynomial-time solvable problems. Moreover, all efficiently solvable problems are shown to have constructive algorithms that require no more than linear time on a random access machine model. Similar dichotomies are exhibited for neurons with bounded threshold. The results demonstrate, on the one hand, that taking sample constraints into account can lead to the discovery of new efficient algorithms for non-trivial learning problems. On the other hand, hard learning problems may remain intractable even for severely restricted samples.
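To make the consistency problem concrete, the following is a minimal Python sketch (all names and the toy sample are hypothetical, not taken from the paper): a brute-force decision procedure that searches all binary weight vectors and integer thresholds for a hypothesis that classifies every example correctly. The exhaustive search is exponential in the input dimension, in line with the NP-completeness of the unrestricted problem; the point of the results above is that suitably restricted samples admit linear-time algorithms instead.

```python
from itertools import product

def consistent(sample, n):
    """Consistency problem for a linear threshold neuron with binary
    weights: is there a w in {0,1}^n and an integer threshold t such
    that (w . x >= t) == label holds for every example in the sample?

    Brute force over all 2^n weight vectors; exponential in n,
    reflecting the NP-completeness of the unrestricted problem.
    """
    for w in product((0, 1), repeat=n):
        for t in range(n + 2):  # only thresholds 0..n+1 matter for 0/1 inputs
            if all((sum(wi * xi for wi, xi in zip(w, x)) >= t) == bool(y)
                   for x, y in sample):
                return w, t  # a consistent hypothesis
    return None  # no binary-weight threshold neuron fits the sample

# Hypothetical toy sample: the neuron should fire iff x1 and x2 are both on.
sample = [((1, 1, 0), 1), ((1, 0, 1), 0), ((0, 1, 1), 0), ((0, 0, 0), 0)]
print(consistent(sample, n=3))  # prints ((1, 1, 0), 2)
```

The threshold range 0..n+1 suffices because, with 0/1 inputs and 0/1 weights, the dot product only takes values in {0, ..., n}. The maximum consistency variant would instead count how many examples a hypothesis satisfies and maximize that count.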