Consider the pattern recognition problem of learning multicategory classification from a labeled sample, for instance, the problem of learning character recognition, where a category corresponds to an alphanumeric letter. The classical theory of pattern recognition assumes that labeled examples appear according to unknown underlying pattern-class conditional probability distributions, with the pattern classes themselves picked at random according to their a priori probabilities. In this paper we pose the following question: can learning accuracy be improved if labeled examples are drawn independently at random according to the underlying class conditional probability distributions, but the pattern classes are chosen not necessarily according to their a priori probabilities? We answer this in the affirmative by showing that there exists a tuning of the sub-sample proportions which minimizes a loss criterion. The tuning is relative to the intrinsic complexity of the Bayes classifier. Since this complexity depends on the underlying probability distributions, which are assumed to be unknown, we provide an algorithm that learns the proportions in an on-line manner, using sample querying, and asymptotically minimizes the criterion. In practice, this algorithm may be used to boost the performance of existing classification learning algorithms by choosing better sub-sample proportions.
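To make the idea concrete, the following is a minimal illustrative sketch, not the paper's actual algorithm: two hypothetical 1-D Gaussian pattern classes, a plug-in threshold classifier, and an on-line loop that, at each round, queries a batch of labeled examples from the class with the larger estimated per-class error, thereby adapting the sub-sample proportions as it learns. All names (`CLASS_PARAMS`, `train_threshold`, `class_error`) and the greedy query rule are assumptions made for this sketch.

```python
import random
import statistics

random.seed(0)

# Hypothetical two-class problem: class-conditional 1-D Gaussians.
CLASS_PARAMS = {0: (0.0, 1.0), 1: (2.0, 1.0)}  # (mean, std) per class

def draw(label, n):
    """Query n labeled examples from the class-conditional distribution."""
    mu, sigma = CLASS_PARAMS[label]
    return [random.gauss(mu, sigma) for _ in range(n)]

def train_threshold(sample0, sample1):
    """Plug-in classifier: threshold at the midpoint of the sample means."""
    return (statistics.mean(sample0) + statistics.mean(sample1)) / 2.0

def class_error(label, threshold, n_test=2000):
    """Monte-Carlo estimate of the per-class misclassification rate."""
    pts = draw(label, n_test)
    if label == 0:
        return sum(x >= threshold for x in pts) / n_test
    return sum(x < threshold for x in pts) / n_test

# On-line proportion tuning (illustrative): query the worse-off class.
s0, s1 = draw(0, 5), draw(1, 5)
for _ in range(50):
    t = train_threshold(s0, s1)
    if class_error(0, t) >= class_error(1, t):
        s0 += draw(0, 5)
    else:
        s1 += draw(1, 5)

t = train_threshold(s0, s1)
p0 = len(s0) / (len(s0) + len(s1))
print(f"final threshold ~ {t:.2f}, class-0 sub-sample proportion = {p0:.2f}")
```

Because the two classes here are symmetric, the learned proportion tends toward a balanced split; with classes of unequal variance or complexity, the loop would allocate more queries to the harder class, which is the effect the paper's tuning formalizes.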