Communications of the ACM
Advances in Neural Information Processing Systems 2
Machine learning: a theoretical approach
A training algorithm for optimal margin classifiers. In COLT '92: Proceedings of the Fifth Annual Workshop on Computational Learning Theory.
Neural networks and the bias/variance dilemma. Neural Computation.
A differential theory of learning for efficient statistical pattern recognition
We outline a differential theory of learning for statistical pattern classification. The theory is based on the classification figure-of-merit (CFM) objective functions described in [9]. We sketch the proof that differential learning is efficient, requiring the least classifier complexity and the smallest training sample size necessary to achieve Bayesian (i.e., minimum-error) discrimination. We conclude with a practical application of the theory: a simple, differentially trained linear neural network classifier that discriminates handwritten digits from the AT&T DB1 database with a 1.3% error rate, less than half the best previously reported error rate for a linear classifier on this optical character recognition (OCR) task [1].
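To make the idea concrete, the sketch below illustrates differential training of a linear classifier with a CFM-style objective. It assumes a sigmoidal CFM of the margin (the difference between the correct-class output and the strongest competing output) and uses synthetic Gaussian data in place of the AT&T DB1 digits; the functional form, hyperparameters, and data are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of differential training with a CFM-style objective.
# Assumptions (not taken from the paper): the CFM is a sigmoid of the
# difference between the correct-class output and the largest competing
# output, the classifier is a single linear layer, and training is plain
# gradient ascent on synthetic data. All hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def cfm(delta, alpha=4.0):
    """Sigmoidal classification figure of merit of the margin delta."""
    return 1.0 / (1.0 + np.exp(-alpha * delta))

def train_linear_cfm(X, y, n_classes, lr=0.05, alpha=4.0, epochs=200):
    """Maximize the mean CFM of a linear classifier W x + b."""
    n, d = X.shape
    W = np.zeros((n_classes, d))
    b = np.zeros(n_classes)
    for _ in range(epochs):
        scores = X @ W.T + b                      # (n, n_classes)
        correct = scores[np.arange(n), y]
        masked = scores.copy()
        masked[np.arange(n), y] = -np.inf
        rival = masked.argmax(axis=1)             # strongest competing class
        delta = correct - masked[np.arange(n), rival]
        # d(CFM)/d(delta) for the sigmoidal CFM above
        s = cfm(delta, alpha)
        g = alpha * s * (1.0 - s)
        # Gradient ascent: push the correct output up, the rival output down.
        grad_W = np.zeros_like(W)
        grad_b = np.zeros_like(b)
        np.add.at(grad_W, y, g[:, None] * X)
        np.add.at(grad_W, rival, -g[:, None] * X)
        np.add.at(grad_b, y, g)
        np.add.at(grad_b, rival, -g)
        W += lr * grad_W / n
        b += lr * grad_b / n
    return W, b

# Illustrative data: three Gaussian blobs standing in for digit classes.
X = np.vstack([rng.normal(m, 0.5, size=(100, 2)) for m in ([0, 0], [3, 0], [0, 3])])
y = np.repeat(np.arange(3), 100)
W, b = train_linear_cfm(X, y, n_classes=3)
pred = (X @ W.T + b).argmax(axis=1)
print(f"training error rate: {np.mean(pred != y):.3f}")
```

Note that the objective rewards only the ordering of the class outputs (correct class above all rivals), rather than fitting the outputs to target values, which is the sense in which the training is differential rather than, say, squared-error based.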