A random polynomial-time algorithm for approximating the volume of convex bodies. Journal of the ACM (JACM).
Query by committee. COLT '92: Proceedings of the fifth annual workshop on Computational learning theory.
Bounds on the Sample Complexity of Bayesian Learning Using Information Theory and the VC Dimension. Machine Learning (special issue on computational learning theory).
Selective Sampling Using the Query by Committee Algorithm. Machine Learning.
A PAC analysis of a Bayesian estimator. COLT '97: Proceedings of the tenth annual conference on Computational learning theory.
COLT '98: Proceedings of the eleventh annual conference on Computational learning theory.
A Tutorial on Support Vector Machines for Pattern Recognition. Data Mining and Knowledge Discovery.
Active learning with committees for text categorization. AAAI '97/IAAI '97: Proceedings of the fourteenth national conference on artificial intelligence and ninth conference on Innovative applications of artificial intelligence.
Large-scale text categorization by batch mode active learning. Proceedings of the 15th international conference on World Wide Web.
Batch mode active learning and its application to medical image classification. ICML '06: Proceedings of the 23rd international conference on Machine learning.
Learning the unified kernel machines for classification. Proceedings of the 12th ACM SIGKDD international conference on Knowledge discovery and data mining.
Active sampling for multiple output identification. Machine Learning.
Active Learning by Spherical Subdivision. The Journal of Machine Learning Research.
Unsupervised active learning based on hierarchical graph-theoretic clustering. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics.
Active sampling for multiple output identification. COLT '06: Proceedings of the 19th annual conference on Learning Theory.
Analysis of perceptron-based active learning. COLT '05: Proceedings of the 18th annual conference on Learning Theory; The Journal of Machine Learning Research.
Active learning via perfect selective classification. The Journal of Machine Learning Research.
A long-standing goal in Machine Learning is to minimize sample complexity, i.e., to reduce as much as possible the number of examples used in the course of learning. The Active Learning paradigm is one method aimed at achieving this goal by transforming the learner from a passive participant in the information-gathering process into an active one. Roughly speaking, the learner tries to minimize the number of labeled instances used in the course of learning, relying also on unlabeled instances to acquire the needed information whenever possible. The motivation comes from many real-life problems in which the teacher's labeling effort is an expensive resource (e.g., text categorization, part-of-speech tagging). The Query By Committee (QBC) algorithm (Seung et al., Query by committee, Proceedings of the Fifth Workshop on Computational Learning Theory, Morgan Kaufmann, San Mateo, CA, 1992, pp. 287-294) is an Active Learning algorithm operating in the Bayesian model of concept learning (Haussler et al., Mach. Learning 14 (1994) 83), i.e., it assumes that the concept to be learned is chosen according to some fixed and known distribution. When applying the QBC algorithm to learning the class of linear separators, one faces the problem of implementing the mechanism that samples hypotheses (the Gibbs oracle). The major obstacle is computational complexity, since the straightforward Monte Carlo method takes exponential time. In this paper we address the problems involved in the implementation of such a mechanism. We show how to convert them into questions about sampling from convex bodies or approximating the volume of such bodies. Similar problems have recently been solved in the field of computational geometry using random walks. These techniques enable us to devise efficient implementations of the QBC algorithm.
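To make the setting concrete: for linear separators, the version space consistent with labeled examples (x_i, y_i) is the convex body {w : y_i (w · x_i) > 0} intersected with the unit ball, and the Gibbs oracle samples hypotheses from it. The sketch below pairs a hit-and-run random walk (one of the convex-body samplers alluded to above) with a two-member QBC committee. All names and parameters are illustrative, not the paper's actual construction; in particular, the starting point of the walk is assumed to lie strictly inside the version space, whereas a full implementation would locate an interior point itself and would use walk lengths backed by mixing-time bounds.

```python
import numpy as np

def hit_and_run(constraints, w, n_steps=200, rng=None):
    """Random walk inside {w : a.w > 0 for all a} intersected with the unit ball.
    Each constraint vector is a = y_i * x_i for a labeled example (x_i, y_i);
    w must start strictly inside this region (an assumption of this sketch)."""
    rng = rng if rng is not None else np.random.default_rng()
    for _ in range(n_steps):
        d = rng.standard_normal(w.shape)
        d /= np.linalg.norm(d)               # uniform random direction
        # Chord through the unit ball: solve ||w + t d||^2 = 1 for t.
        b = w @ d
        half = np.sqrt(b * b + (1.0 - w @ w))
        lo, hi = -b - half, -b + half
        # Tighten the chord against each half-space a.(w + t d) > 0.
        for a in constraints:
            ad, aw = a @ d, a @ w
            if abs(ad) < 1e-12:
                continue
            t = -aw / ad
            if ad > 0:
                lo = max(lo, t)
            else:
                hi = min(hi, t)
        w = w + rng.uniform(lo, hi) * d      # uniform point on the feasible segment
    return w

def qbc(pool, oracle, w_start, rng=None):
    """One pass of Query By Committee over an unlabeled pool.
    w_start is assumed consistent with every label the oracle can return,
    so it stays interior to all future version spaces (a real implementation
    would instead find an interior point, e.g. by linear programming)."""
    rng = rng if rng is not None else np.random.default_rng()
    constraints, n_queries = [], 0
    w = w_start
    for x in pool:
        # Committee of two hypotheses drawn (approximately) via the Gibbs oracle.
        w1 = hit_and_run(constraints, w_start.copy(), rng=rng)
        w2 = hit_and_run(constraints, w_start.copy(), rng=rng)
        if np.sign(w1 @ x) != np.sign(w2 @ x):   # committee disagrees: query label
            constraints.append(oracle(x) * x)
            n_queries += 1
        w = w1
    return w, n_queries
```

A quick synthetic check: label points by a known separator w* and start the walk at w*/2, which is interior to every version space the run can produce. Restarting each walk from that fixed point is wasteful but keeps the sketch obviously correct; warm-starting from the previous sample is the natural refinement.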
We also give a few improvements and corrections to the QBC algorithm, the most important of which is dropping the Bayes assumption when the concept class possesses a certain symmetry property (which holds for linear separators). We draw attention to a useful geometric lemma that bounds the maximal radius of a ball contained in a convex body. Finally, this paper exhibits a connection between random walks and such Machine Learning notions as ε-nets and support vector machines.