Communications of the ACM
Classifying learnable geometric concepts with the Vapnik-Chervonenkis dimension
STOC '86 Proceedings of the eighteenth annual ACM symposium on Theory of computing
On the learnability of Boolean formulae
STOC '87 Proceedings of the nineteenth annual ACM symposium on Theory of computing
Quantifying inductive bias: AI learning algorithms and Valiant's learning framework
Artificial Intelligence
Computational limitations on learning from examples
Journal of the ACM (JACM)
Cryptographic limitations on learning Boolean formulae and finite automata
STOC '89 Proceedings of the twenty-first annual ACM symposium on Theory of computing
The Strength of Weak Learnability
Machine Learning
A Necessary Condition for Learning from Positive Examples
Machine Learning
Equivalence of models for polynomial learnability
COLT '88 Proceedings of the first annual workshop on Computational learning theory
Learnability by fixed distributions
COLT '88 Proceedings of the first annual workshop on Computational learning theory
A general lower bound on the number of examples needed for learning
COLT '88 Proceedings of the first annual workshop on Computational learning theory
Computers and Intractability: A Guide to the Theory of NP-Completeness
Machine Learning
On the complexity of learning for a spiking neuron (extended abstract)
COLT '97 Proceedings of the tenth annual conference on Computational learning theory
Learning with maximum-entropy distributions
COLT '97 Proceedings of the tenth annual conference on Computational learning theory
Testing problems with sub-learning sample complexity
COLT' 98 Proceedings of the eleventh annual conference on Computational learning theory
STOC '02 Proceedings of the thirty-fourth annual ACM symposium on Theory of computing
Learning with Maximum-Entropy Distributions
Machine Learning
A geometric approach to leveraging weak learners
Theoretical Computer Science
Learning Sub-classes of Monotone DNF on the Uniform Distribution
ALT '98 Proceedings of the 9th International Conference on Algorithmic Learning Theory
Hybrid Learning Schemes for Multimedia Information Retrieval
PCM '02 Proceedings of the Third IEEE Pacific Rim Conference on Multimedia: Advances in Multimedia Information Processing
On Learning Monotone DNF under Product Distributions
COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
An Architecture of a Web-Based Collaborative Image Search Engine
On the Move to Meaningful Internet Systems, 2002 - DOA/CoopIS/ODBASE 2002 Confederated International Conferences DOA, CoopIS and ODBASE 2002
On Learning Monotone Boolean Functions under the Uniform Distribution
ALT '02 Proceedings of the 13th International Conference on Algorithmic Learning Theory
MEGA---the maximizing expected generalization algorithm for learning complex query concepts
ACM Transactions on Information Systems (TOIS)
On learning monotone DNF under product distributions
Information and Computation
On learning monotone boolean functions under the uniform distribution
Theoretical Computer Science - Algorithmic learning theory (ALT 2002)
Fuzzy lattice reasoning (FLR) classifier and its application for ambient ozone estimation
International Journal of Approximate Reasoning
Genetic algorithm-based feature set partitioning for classification problems
Pattern Recognition
APPROX '08 / RANDOM '08 Proceedings of the 11th international workshop, APPROX 2008, and 12th international workshop, RANDOM 2008 on Approximation, Randomization and Combinatorial Optimization: Algorithms and Techniques
Property Testing: A Learning Theory Perspective
Foundations and Trends® in Machine Learning
Discrete Applied Mathematics
On derandomization and average-case complexity of monotone functions
Theoretical Computer Science
A complete characterization of statistical query learning with applications to evolvability
Journal of Computer and System Sciences
Linguistic categorization and complexity
SIGMORPHON '12 Proceedings of the Twelfth Meeting of the Special Interest Group on Computational Morphology and Phonology
Learnability of DNF with representation-specific queries
Proceedings of the 4th conference on Innovations in Theoretical Computer Science
Efficient distribution-free learning of Boolean formulas from positive and negative examples is considered. It is shown that classes of formulas that are efficiently learnable from only positive examples or only negative examples have certain closure properties. A new substitution technique is used to show that in the distribution-free case learning DNF (disjunctive normal form formulas) is no harder than learning monotone DNF. We prove that monomials cannot be efficiently learned from negative examples alone, even if the negative examples are uniformly distributed. It is also shown that, if the examples are drawn from uniform distributions, then the class of DNF in which each variable occurs at most once is efficiently weakly learnable (i.e., individual examples are correctly classified with a probability larger than 1/2 + 1/p, where p is a polynomial in the relevant parameters of the learning problem). We then show an equivalence between the notion of weak learning and the notion of group learning, where a group of examples of polynomial size, either all positive or all negative, must be correctly classified with high probability.
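The substitution technique mentioned in the abstract can be illustrated concretely. A minimal sketch, under the standard construction: each negated literal ¬x_i in a DNF over n variables is replaced by a fresh variable y_i, yielding a monotone DNF over 2n variables, and each n-bit example a is mapped to the 2n-bit example (a, complement of a) so that labels are preserved. Function names and the index convention (y_i stored at position n + i) here are illustrative choices, not from the paper.

```python
def transform_example(a):
    """Map an n-bit example a to the 2n-bit example (a, complement of a)."""
    return a + [1 - bit for bit in a]

def monotonize_dnf(dnf, n):
    """Rewrite each signed literal (i, positive) as an unnegated index:
    x_i becomes index i, and the negated literal maps to index n + i (y_i).
    The resulting terms contain no negations, so the DNF is monotone."""
    return [[i if positive else n + i for (i, positive) in term]
            for term in dnf]

def eval_monotone_dnf(terms, example):
    """A monotone DNF is satisfied iff some term has all its indices set to 1."""
    return any(all(example[i] == 1 for i in term) for term in terms)

# Illustrative formula f = (x0 AND NOT x1) OR x2 over n = 3 variables.
dnf = [[(0, True), (1, False)], [(2, True)]]
mono = monotonize_dnf(dnf, 3)          # [[0, 4], [2]]

# The transformed examples classify identically under the monotone formula:
print(eval_monotone_dnf(mono, transform_example([1, 0, 0])))  # f is true here
print(eval_monotone_dnf(mono, transform_example([0, 1, 0])))  # f is false here
```

Because the transformation of examples is computable in linear time and preserves the label and the example distribution's structure, an efficient learner for monotone DNF over 2n variables yields one for general DNF over n variables, which is the sense in which learning DNF is no harder than learning monotone DNF.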