Automatic Pattern Recognition: A Study of the Probability of Error
IEEE Transactions on Pattern Analysis and Machine Intelligence
A new approach for estimating classification errors is presented. In the model, there are two types of classification error: the empirical error and the generalization error. The first is the error observed over the training samples; the second is the discrepancy between the error probability and the empirical error. In this research, the Vapnik-Chervonenkis dimension (VC dimension) is used as a measure of classifier complexity, and an estimate of the generalization error is developed from this complexity measure. An optimal classifier design criterion, the generalized minimum empirical error (GMEE) criterion, is then used. The GMEE criterion consists of two terms: the empirical error and the estimate of the generalization error. As an application, the criterion is used to design an optimal neural network classifier, and a corollary on the Gamma optimality of neural-network-based classifiers is proven. The approach thus provides a theoretical foundation for the connectionist approach to optimal classifier design. Experimental results are presented to validate the approach.
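The abstract does not give the paper's exact generalization-error estimate, but a criterion of this two-term form can be sketched by standing in the classical Vapnik VC confidence term for the estimate. The sketch below is illustrative only: the function names, the confidence level `eta`, and the use of the standard VC bound are assumptions, not the paper's formulation.

```python
import math

def vc_confidence(h, n, eta=0.05):
    # Classical VC confidence term (Vapnik), used here as an
    # illustrative stand-in for the paper's generalization-error
    # estimate: h is the VC dimension, n the number of training
    # samples, eta the confidence level.
    return math.sqrt((h * (math.log(2.0 * n / h) + 1.0)
                      - math.log(eta / 4.0)) / n)

def gmee(empirical_error, h, n, eta=0.05):
    # GMEE-style criterion: empirical error plus an estimate of the
    # generalization error, to be minimized over candidate classifiers.
    return empirical_error + vc_confidence(h, n, eta)

# Hypothetical comparison on n = 1000 training samples: a simple
# classifier (h = 10) with higher empirical error versus a complex
# one (h = 500) that fits the training data better.
score_simple = gmee(0.08, h=10, n=1000)
score_complex = gmee(0.01, h=500, n=1000)
```

Under this sketch the simpler classifier attains the lower criterion value, illustrating how the second term penalizes complexity even when the empirical error is smaller.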