Learning in threshold networks
COLT '88 Proceedings of the first annual workshop on Computational learning theory
A statistical approach to learning and generalization in layered neural networks
COLT '89 Proceedings of the second annual workshop on Computational learning theory
Recognizing hand-printed letters and digits using backpropagation learning
Neural Computation
Bounds on the sample complexity of Bayesian learning using information theory and the VC dimension
COLT '91 Proceedings of the fourth annual workshop on Computational learning theory
Machine learning: a theoretical approach
Directed drift: a new linear threshold algorithm for learning binary weights on-line
Journal of Computer and System Sciences
Decision theoretic generalizations of the PAC model for neural net and other learning applications
We address the precision required by an N-input threshold element to implement a linearly separable mapping. In contrast to previous work, we require only that the element correctly implement the mapping on P randomly chosen training examples, rather than the complete Boolean mapping. Our results are obtained within the statistical mechanics approach and are thus average-case results, as opposed to the worst-case analyses in the computational learning theory literature. We show that as long as the fraction P/N remains finite, then with probability close to 1 as N → ∞ a finite number of bits suffices to implement the mapping. This should be compared with the worst-case analysis, which requires O(N log N) bits. We also calculate the ability of the constrained network to predict novel examples and compare its predictions to those of an unconstrained network. Finally, we address the performance of the finite-precision network in the face of noisy training examples.
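The central claim — that a few bits of precision per weight suffice to realize a mapping on P = αN random training examples — can be illustrated with a toy experiment. This is a minimal sketch of my own, not the paper's statistical-mechanics calculation; all function names and parameter choices (the perceptron rule, uniform 5-bit quantization, N = 50, P = 25) are assumptions made for illustration:

```python
import random

def sign(x):
    return 1 if x >= 0 else -1

def make_data(N, P, rng):
    """Random +-1 inputs labeled by a random Gaussian 'teacher' hyperplane,
    so the training set is linearly separable by construction."""
    teacher = [rng.gauss(0, 1) for _ in range(N)]
    xs = [[rng.choice((-1, 1)) for _ in range(N)] for _ in range(P)]
    ys = [sign(sum(t * xi for t, xi in zip(teacher, x))) for x in xs]
    return xs, ys

def perceptron(xs, ys, N, epochs=1000):
    """Classic perceptron rule; converges on separable data."""
    w = [0.0] * N
    for _ in range(epochs):
        errors = 0
        for x, y in zip(xs, ys):
            if sign(sum(wi * xi for wi, xi in zip(w, x))) != y:
                w = [wi + y * xi for wi, xi in zip(w, x)]
                errors += 1
        if errors == 0:
            break
    return w

def quantize(w, bits):
    """Round each weight to one of 2**bits uniform levels on [-max|w|, max|w|]."""
    m = max(abs(wi) for wi in w) or 1.0
    step = 2 * m / (2 ** bits - 1)
    return [round(wi / step) * step for wi in w]

def accuracy(w, xs, ys):
    hits = sum(sign(sum(wi * xi for wi, xi in zip(w, x))) == y
               for x, y in zip(xs, ys))
    return hits / len(xs)

rng = random.Random(0)
N, P = 50, 25                     # alpha = P/N = 0.5, held finite
xs, ys = make_data(N, P, rng)
w = perceptron(xs, ys, N)
wq = quantize(w, bits=5)          # 5 bits of precision per weight
print("full precision:", accuracy(w, xs, ys))
print("5-bit weights: ", accuracy(wq, xs, ys))
```

With the quantization grid scaled to the learned weights, the 5-bit network typically still fits the training set, consistent in spirit with the average-case result (finite bits suffice when P/N is finite), though a single run of course proves nothing about the N → ∞ limit.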