In this paper we study a new restriction of the PAC learning framework, in which each label class is handled by an unsupervised learner that aims to fit an appropriate probability distribution to its own data. A hypothesis is derived by choosing, for any unlabeled instance, the label whose distribution assigns it the higher likelihood. The motivation for the new learning setting is that the general approach of fitting separate distributions to each label class is often used in practice for classification problems, and the resulting set of probability distributions is more useful than a collection of decision boundaries. A question that arises, however, is whether it is ever more tractable (in terms of computational complexity or required sample size) to find a simple decision boundary than to divide the problem into separate unsupervised learning problems and find appropriate distributions. Within this framework, we give algorithms for learning various simple geometric concept classes. In the boolean domain we show how to learn parity functions, as well as functions having a constant upper bound on the number of relevant attributes. These results distinguish the new setting from various other well-known restrictions of PAC learning. We also give an algorithm for learning monomials over input vectors generated by an unknown product distribution. The main open problem is whether monomials (or any other concept class) distinguish learnability in this framework from standard PAC learnability.
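The decision rule described in the abstract (fit one distribution per label class with an unsupervised learner, then label an instance by maximum likelihood) can be illustrated with a short sketch. The Python code below is illustrative only and is not any algorithm from the paper: the product-of-Bernoullis model, the Laplace smoothing, and all names are assumptions chosen to show the rule on boolean data.

```python
# Hypothetical sketch of the setting: one unsupervised learner per label
# class, with classification by maximum likelihood. The distribution family
# (independent Bernoullis per attribute) is an assumption for illustration.

import numpy as np

class ProductBernoulli:
    """Fits an independent Bernoulli to each boolean attribute."""
    def fit(self, X):
        n, _ = X.shape
        # Laplace smoothing keeps every likelihood strictly positive.
        self.p = (X.sum(axis=0) + 1.0) / (n + 2.0)
        return self

    def log_likelihood(self, x):
        return float(np.sum(x * np.log(self.p) + (1 - x) * np.log(1 - self.p)))

def fit_per_class(X, y):
    """One unsupervised learner per label class, each seeing only its own data."""
    return {label: ProductBernoulli().fit(X[y == label]) for label in np.unique(y)}

def classify(models, x):
    """Label an unlabeled instance by the class whose distribution assigns
    it the higher likelihood."""
    return max(models, key=lambda label: models[label].log_likelihood(x))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy boolean data: the label is simply a copy of attribute 0.
    X = rng.integers(0, 2, size=(200, 5))
    y = X[:, 0]
    models = fit_per_class(X, y)
    print(classify(models, np.array([1, 0, 1, 0, 0])))  # likely prints 1
```

Note that the sketch returns a pair of distributions rather than a decision boundary, which is exactly the extra usefulness the abstract attributes to this approach; the open question it raises is whether this ever costs more, computationally or in samples, than learning the boundary directly.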