We study the proper learnability of axis-parallel concept classes in the PAC-learning and exact-learning models. These classes include unions of boxes, DNF, decision trees, and multivariate polynomials. For constant-dimensional axis-parallel concept classes C we show that the following problems have time complexities that are within a polynomial factor of each other: (1) C is α-properly exactly learnable (with hypotheses of size at most α times the target size) from membership and equivalence queries; (2) C is α-properly PAC learnable (without membership queries) under any product distribution; (3) there is an α-approximation algorithm for the MINEQUIC problem (given a g ∈ C, find a minimal-size f ∈ C that is logically equivalent to g). In particular, if one of these problems has polynomial time complexity, they all do. Using this equivalence we give the first proper-learning algorithm for constant-dimensional decision trees, and the first negative results on proper learning from membership and equivalence queries for many classes. For axis-parallel concepts over a nonconstant dimension we show that, with the equivalence oracle alone, (1) ⇒ (3). We use this to show that (binary) decision trees are not properly learnable in polynomial time (assuming P ≠ NP) and that DNF is not s^ε-properly learnable (assuming Σ₂^P ≠ P^NP).
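To illustrate the membership-query model underlying result (1), here is a minimal sketch (not the paper's algorithm) of exactly learning a single hidden axis-parallel box over a discretized domain {0, …, n−1}^d: given one positive example, each of the 2d boundaries is found by binary search with membership queries, using O(d log n) queries in total. The function and oracle names are illustrative assumptions.

```python
def learn_box(dim, n, member, positive_point):
    """Recover a hidden axis-parallel box over {0,...,n-1}^dim
    from membership queries, given one point known to lie inside it.
    Each boundary is located by binary search: O(dim * log n) queries."""
    lows, highs = [], []
    for i in range(dim):
        p = list(positive_point)  # probe point; only coordinate i varies
        # Lower boundary: smallest v with p (coord i set to v) in the box.
        lo, hi = 0, positive_point[i]
        while lo < hi:
            mid = (lo + hi) // 2
            p[i] = mid
            if member(tuple(p)):
                hi = mid
            else:
                lo = mid + 1
        lows.append(lo)
        # Upper boundary: largest v with p (coord i set to v) in the box.
        lo, hi = positive_point[i], n - 1
        while lo < hi:
            mid = (lo + hi + 1) // 2
            p[i] = mid
            if member(tuple(p)):
                lo = mid
            else:
                hi = mid - 1
        highs.append(lo)
    return lows, highs


# Hypothetical usage: hidden box [2,7] x [1,5] inside {0,...,9}^2.
member = lambda x: 2 <= x[0] <= 7 and 1 <= x[1] <= 5
print(learn_box(2, 10, member, (3, 2)))  # ([2, 1], [7, 5])
```

The paper's setting is harder — unions of such boxes, learned properly with hypotheses from the same class — but the sketch shows why membership queries are so powerful in constant dimension: each boundary costs only logarithmically many queries.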