In this paper, we present an approach to commonsense causal explanation of stories that can be used to automatically determine the liable party in legal case descriptions. The approach is based on LRI-Core, a core ontology for law that takes a commonsense perspective. Aside from our thesis that many terms in the legal domain still have a strong commonsense flavour, the descriptions of events in legal cases, as presented for example at judicial trials, are also cast in commonsense terms. We present design principles for representing commonsense causation, and describe a process-based approach to the automatic identification of causal relations in stories, which are described in terms of the core ontology. The resulting causal explanation forms a necessary condition for determining the liability and responsibility of the agents that play a role in the case. We describe the basic architecture and operation of DIRECT, the demonstrator we are constructing to test the validity of our process-oriented view on commonsense causation. This view holds that causal relations are in fact abstractions constructed on the basis of our commonsense understanding of physical and mental processes.
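The process-oriented view can be illustrated with a toy sketch (this is an illustration of the general idea, not the DIRECT implementation; all names and the state-matching rule are assumptions): story events are modelled as processes with required and produced states, and a causal relation between two processes is abstracted whenever one produces a state the other requires.

```python
from dataclasses import dataclass

@dataclass
class Process:
    """A commonsense physical or mental process occurring in a story."""
    name: str
    requires: set  # states that must hold for the process to run
    produces: set  # states the process brings about

def causal_links(processes):
    """Abstract causal relations from process descriptions:
    p1 is linked to p2 when p1 produces a state that p2 requires."""
    links = []
    for p1 in processes:
        for p2 in processes:
            if p1 is not p2 and p1.produces & p2.requires:
                links.append((p1.name, p2.name))
    return links

# A hypothetical story: a thrown stone breaks a window, letting rain in.
story = [
    Process("throw_stone", {"stone_at_hand"}, {"stone_in_flight"}),
    Process("window_breaks", {"stone_in_flight"}, {"window_broken"}),
    Process("rain_enters", {"window_broken"}, {"carpet_wet"}),
]

print(causal_links(story))
# [('throw_stone', 'window_breaks'), ('window_breaks', 'rain_enters')]
```

The recovered chain is the commonsense causal explanation that liability reasoning would then operate on; in the paper's setting the processes and states would of course be terms from the core ontology rather than ad hoc strings.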