Communications of the ACM
Bounded-width polynomial-size branching programs recognize exactly those languages in NC1
Journal of Computer and System Sciences - 18th Annual ACM Symposium on Theory of Computing (STOC), May 28-30, 1986
Learnability and the Vapnik-Chervonenkis dimension
Journal of the ACM (JACM)
Learning monotone DNF with an incomplete membership oracle
COLT '91 Proceedings of the fourth annual workshop on Computational learning theory
Symbolic-neural systems and the use of hints for developing complex systems
International Journal of Man-Machine Studies
Learning read-once formulas with queries
Journal of the ACM (JACM)
Learning Conjunctions of Horn Clauses
Machine Learning - Computational learning theory
Efficient noise-tolerant learning from statistical queries
STOC '93 Proceedings of the twenty-fifth annual ACM symposium on Theory of computing
Learning with malicious membership queries and exceptions (extended abstract)
COLT '94 Proceedings of the seventh annual conference on Computational learning theory
Toward Efficient Agnostic Learning
Machine Learning - Special issue on computational learning theory, COLT'92
Learning with unreliable boundary queries
COLT '95 Proceedings of the eighth annual conference on Computational learning theory
Learning internal representations
COLT '95 Proceedings of the eighth annual conference on Computational learning theory
On learning width two branching programs (extended abstract)
COLT '96 Proceedings of the ninth annual conference on Computational learning theory
A Bayesian/Information Theoretic Model of Learning to Learn via Multiple Task Sampling
Machine Learning - Special issue on inductive transfer
Malicious Omissions and Errors in Answers to Membership Queries
Machine Learning
A Dozen Tricks with Multitask Learning
Neural Networks: Tricks of the Trade (an outgrowth of a 1996 NIPS workshop)
Rule-Injection Hints as a Means of Improving Network Performance and Learning Time
Proceedings of the EURASIP Workshop 1990 on Neural Networks
Learning Ordered Binary Decision Diagrams
ALT '95 Proceedings of the 6th International Conference on Algorithmic Learning Theory
Exponentiated Gradient versus Gradient Descent for Linear Predictors
Worst-case quadratic loss bounds for prediction using linear functions and gradient descent
IEEE Transactions on Neural Networks
In most concept learning problems considered so far by the learning theory community, the instances are labeled by a single unknown target. However, in some situations, although the target concept may be quite complex when expressed as a function of the attribute values of an instance, it may have a simple relationship with some intermediate (yet to be learned) concepts. In such cases, it may be advantageous to learn both the intermediate concepts and the target concept in parallel, and to use the intermediate concepts to enhance our approximation of the target concept. In this paper, we consider the problem of learning multiple interrelated concepts simultaneously. To avoid stability problems, we assume that the dependency relations among the concepts are not cyclical and hence can be expressed using a directed acyclic graph (not known to the learner). We investigate this learning problem in several popular theoretical models: the mistake bound model, the exact learning model, and the probably approximately correct (PAC) model.
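As a toy illustration of the setting described above (a hypothetical sketch, not taken from the paper; all function names and the particular concepts are assumptions for illustration), the target may be a simple disjunction of two intermediate concepts, each of which is itself a conjunction over the raw attributes, so the dependencies form a small DAG from attributes to intermediates to target:

```python
# Hypothetical example: a target concept that is a multi-term DNF over the
# raw boolean attributes x[0..4], but a simple OR of two intermediate
# concepts. The dependency structure is a DAG: x -> (c1, c2) -> target.

def c1(x):
    # Intermediate concept 1: a conjunction of the first three attributes.
    return x[0] and x[1] and x[2]

def c2(x):
    # Intermediate concept 2: a conjunction of the last two attributes.
    return x[3] and x[4]

def target(x):
    # The target is simple when expressed over the intermediates,
    # even though it is more complex over the raw attributes.
    return c1(x) or c2(x)

print(target([True, True, True, False, False]))   # c1 holds
print(target([False, False, False, True, True]))  # c2 holds
print(target([True, False, True, True, False]))   # neither holds
```

A learner that acquires c1 and c2 alongside the target can represent the target with a much simpler formula than one working directly over the attributes, which is the advantage the abstract describes.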