Structural results about exact learning with unspecified attribute values
Journal of Computer and System Sciences - Eleventh annual conference on computational learning theory / Twelfth Annual IEEE conference on computational complexity
A challenging problem within machine learning is how to make good inferences from data sets in which pieces of information are missing. While it is valuable to have algorithms that perform well for specific domains, gaining a fundamental understanding of the problem requires a "theory" of how to learn with incomplete data. The important contribution of such a theory is not so much the specific algorithmic results as the fact that it provides good ways of thinking about the problem formally. In this paper we introduce the unspecified attribute value (UAV) learning model as a first step toward a theoretical framework for studying the problem of learning from incomplete data in the exact learning framework.

In the UAV learning model, an example x is classified positive (resp., negative) if all possible assignments for the unspecified attributes result in a positive (resp., negative) classification; otherwise the classification given to x is "?" (for unknown). Given an example x in which some attributes are unspecified, the oracle UAV-MQ responds with the classification of x. Given a hypothesis h, the oracle UAV-EQ returns an example x (possibly with unspecified attributes) for which h(x) is incorrect.

We show that any class of functions learnable in Angluin's exact model using the MQ and EQ oracles is also learnable in the UAV model using the MQ and UAV-EQ oracles, provided that the counterexamples returned by the UAV-EQ oracle have a logarithmic number of unspecified attributes. We also show that any class learnable in the exact model using the MQ and EQ oracles is learnable in the UAV model using the UAV-MQ and UAV-EQ oracles together with an oracle that evaluates a given Boolean formula on an example with unspecified attributes. (For some hypothesis classes, such as decision trees and unate formulas, this evaluation can be done in polynomial time without an oracle.)
We also study the learnability of a universal class of decision trees under the UAV model and of DNF formulas under a representation-dependent variation of the UAV model.
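The UAV classification rule described above can be sketched directly by enumerating all completions of the unspecified attributes. The following Python helper is an illustrative sketch only (the function name `uav_classify` and the encoding of unspecified attributes as `None` are our assumptions, not notation from the paper):

```python
from itertools import product

def uav_classify(f, x):
    """Classify a partially specified example under the UAV semantics.

    f: a Boolean function over fully specified examples (tuple of 0/1 -> bool)
    x: a tuple whose entries are 0, 1, or None (None = unspecified attribute)
    Returns 1 if every completion of x is positive, 0 if every completion
    is negative, and '?' otherwise.
    """
    unspecified = [i for i, v in enumerate(x) if v is None]
    labels = set()
    # Try every 0/1 assignment to the unspecified attributes.
    for bits in product([0, 1], repeat=len(unspecified)):
        completed = list(x)
        for i, b in zip(unspecified, bits):
            completed[i] = b
        labels.add(bool(f(tuple(completed))))
        if len(labels) == 2:       # both labels seen: classification is unknown
            return '?'
    return 1 if True in labels else 0

# Example with f = x0 AND x1:
f = lambda e: e[0] == 1 and e[1] == 1
print(uav_classify(f, (1, 1)))     # 1: fully specified, positive
print(uav_classify(f, (0, None)))  # 0: negative under both completions
print(uav_classify(f, (1, None)))  # ?: label depends on the missing x1
```

Note that this brute-force evaluation is exponential in the number of unspecified attributes, which is consistent with the restriction above to counterexamples with only logarithmically many unspecified attributes.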