The inference of explanations is a problem typically studied in the field of Temporal Reasoning through approaches to reasoning about action and change, which usually aim to infer statements that explain a given change. Most proposed works are based on logical inference mechanisms that assume the existence of general domain knowledge. Unfortunately, the availability of a domain theory is a requirement that cannot always be guaranteed. In this paper we address the problem from a data-driven perspective, where the aim is to discover the events that can plausibly explain the transition of the observed domain from one state to another. Our approach splits the problem into two sub-problems: the extraction of temporal states and the discovery of the events. We applied the approach to Industrial Process Supervision and Medical Diagnosis scenarios in order to support the tasks of domain experts; the experimental results show promising aspects of our proposal.
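To make the two-step pipeline concrete, here is a minimal sketch in Python: first extract temporal states from a series of observations, then propose candidate events at each state-to-state transition. The thresholding scheme, function names, and data are illustrative assumptions for exposition, not the method actually described in the paper.

```python
# Hypothetical two-step, data-driven pipeline: (1) extract temporal
# states from raw readings, (2) propose events that could plausibly
# explain each change from one state to the next. Quantization by a
# fixed threshold is an assumption made only for this sketch.

def extract_states(readings, threshold):
    """Quantize a numeric series into 'low'/'high' labels and merge
    consecutive identical labels into (state, start, end) runs."""
    labels = ["high" if v >= threshold else "low" for v in readings]
    states = []
    for i, lab in enumerate(labels):
        if states and states[-1][0] == lab:
            states[-1] = (lab, states[-1][1], i)  # extend the current run
        else:
            states.append((lab, i, i))            # start a new run
    return states

def candidate_events(states):
    """For each pair of adjacent states, emit a candidate event that
    marks (and could plausibly explain) the state change."""
    return [
        {"at": nxt[1], "from": cur[0], "to": nxt[0]}
        for cur, nxt in zip(states, states[1:])
    ]

readings = [0.2, 0.3, 0.9, 1.1, 1.0, 0.4, 0.3]
states = extract_states(readings, threshold=0.5)
events = candidate_events(states)
# states: three runs (low, high, low); events: two transitions
```

In a supervision or diagnosis setting, each emitted transition would then be matched against observed occurrences in the data to select the events that best explain the change.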