Modelling conditional knowledge discovery and belief revision by abstract state machines
ASM'03 Proceedings of the abstract state machines 10th international conference on Advances in theory and practice
Knowledge discovery and data mining deal with the task of finding useful information, and especially rules, in unstructured data. Most knowledge discovery approaches associate conditional probabilities with discovered rules in order to specify their strength. In this paper, we propose a qualitative approach to knowledge discovery: we abstract from actual probabilities to qualitative information, in particular by developing a method for computing an ordinal conditional function from a possibly noisy probability distribution. The link between structural and numerical knowledge is established by a powerful algebraic theory of conditionals. Applying this theory, we develop an algorithm that computes sets of default rules from the qualitative abstraction of the input distribution, and we show how sparse information can be handled appropriately in our framework. By exploiting the duality between inductive reasoning and knowledge discovery within the algebraic theory of conditionals, we ensure that the discovered rules are most informative in a strict, formal sense.
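To illustrate the kind of qualitative abstraction the abstract describes, the following sketch converts a probability distribution over possible worlds into an ordinal conditional function (OCF, a Spohn-style ranking function) by taking orders of magnitude with respect to an abstraction base, and then reads off default rules. This is only a minimal illustration under assumed conventions (the base `EPS`, the acceptance condition kappa(A and B) < kappa(A and not B)), not the paper's actual algorithm.

```python
import math

# Assumed abstraction base: ranks are orders of magnitude w.r.t. EPS.
EPS = 0.1

def ocf_from_distribution(dist, eps=EPS):
    """Abstract a probability distribution into integer ranks kappa(w).

    kappa(w) = round(log P(w) / log eps), so more probable worlds get
    lower (more plausible) ranks; worlds with P(w) = 0 get rank infinity.
    Ranks are normalized so the most plausible world has rank 0.
    """
    kappa = {}
    for world, p in dist.items():
        kappa[world] = math.inf if p == 0 else round(math.log(p) / math.log(eps))
    m = min(kappa.values())
    return {w: r - m for w, r in kappa.items()}

def rank(kappa, worlds):
    """Rank of a proposition = minimum rank over its models."""
    return min((kappa[w] for w in worlds), default=math.inf)

def accepts_default(kappa, a_worlds, b_worlds):
    """A default 'if A then normally B' is accepted iff
    kappa(A and B) < kappa(A and not B)."""
    return rank(kappa, a_worlds & b_worlds) < rank(kappa, a_worlds - b_worlds)

# Toy example: worlds over the atoms bird (b) and flies (f).
dist = {"bf": 0.5, "b~f": 0.05, "~bf": 0.01, "~b~f": 0.44}
kappa = ocf_from_distribution(dist)
birds = {"bf", "b~f"}
fliers = {"bf", "~bf"}
print(accepts_default(kappa, birds, fliers))  # birds normally fly
```

Note how the rounding step makes the abstraction robust against noise: small perturbations of the probabilities leave the integer ranks, and hence the accepted defaults, unchanged.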