We study the proper learnability of axis-parallel concept classes in the PAC learning model and in the exact learning model with membership and equivalence queries. These classes include unions of boxes, DNF, decision trees, and multivariate polynomials.

For the constant-dimensional axis-parallel concept classes C, we show that the following problems have the same time complexity:

1. C is α-properly exactly learnable (with hypotheses of size at most α times the target size) from membership and equivalence queries.
2. C is α-properly PAC learnable (without membership queries) under any product distribution.
3. There is an α-approximation algorithm for the MINEQUIC problem (given g ∈ C, find a minimal-size f ∈ C that is equivalent to g).

In particular, C is α-properly learnable in polynomial time from membership and equivalence queries if and only if C is α-properly PAC learnable in polynomial time under the product distribution, if and only if MINEQUIC has a polynomial-time α-approximation algorithm. Using this result we give the first proper learning algorithm for decision trees over the constant-dimensional domain, and the first negative results in proper learning from membership and equivalence queries for many classes.

For the non-constant-dimensional axis-parallel concept classes we show that, with the equivalence oracle alone, (1) ⇒ (3). We use this to show that (binary) decision trees are not properly learnable in polynomial time (assuming P ≠ NP) and that DNF is not s^ε-properly learnable (assuming Σ₂ᵖ ≠ P^NP).
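The axis-parallel classes discussed above generalize the textbook example of PAC learning a single axis-parallel box. As a minimal illustration of what "proper" learning means here (the hypothesis is drawn from the same class as the target), the classic tightest-fit learner is sketched below. This is a standard example from the PAC-learning literature, not an algorithm from this paper; the function names are ours.

```python
def learn_tightest_box(samples):
    """Proper PAC learner for a single axis-parallel box.

    `samples` is a list of (point, label) pairs, where a point is a tuple
    of d coordinates. The hypothesis is the smallest axis-parallel box
    containing every positive example, so it is itself a member of the
    concept class (a "proper" hypothesis).
    """
    positives = [p for p, label in samples if label]
    if not positives:
        return None  # empty hypothesis: classify every point as negative
    d = len(positives[0])
    lo = tuple(min(p[i] for p in positives) for i in range(d))
    hi = tuple(max(p[i] for p in positives) for i in range(d))
    return lo, hi

def classify(box, point):
    """Return True iff `point` lies inside the learned box."""
    if box is None:
        return False
    lo, hi = box
    return all(l <= x <= h for x, l, h in zip(point, lo, hi))
```

Because the tightest-fit box never misclassifies a negative example and can only err on a thin boundary region of the target, O((d/ε)·log(1/δ)) samples suffice in d dimensions. The paper's results concern the much harder setting of richer classes (unions of boxes, decision trees, DNF), where finding a small consistent hypothesis from the same class becomes an approximation problem.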