OWLIM – a pragmatic semantic repository for OWL

  • Authors:
  • Atanas Kiryakov; Damyan Ognyanov; Dimitar Manov

  • Affiliation:
  • Ontotext Lab, Sirma Group Corp., Sofia, Bulgaria (all authors)

  • Venue:
  • WISE'05: Proceedings of the 2005 International Conference on Web Information Systems Engineering
  • Year:
  • 2005

Abstract

OWLIM is a high-performance Storage and Inference Layer (SAIL) for Sesame that performs OWL DLP reasoning based on forward-chaining of entailment rules. Reasoning and query evaluation are performed in-memory, while at the same time OWLIM provides reliable persistence based on N-Triples files. This paper presents OWLIM, together with an evaluation of its scalability over a synthetic but realistic dataset encoded with respect to the PROTON ontology. The experiment demonstrates that OWLIM can scale to millions of statements even on commodity desktop hardware. On an almost-entry-level server, OWLIM can manage a knowledge base of 10 million explicit statements, which grow to about 19 million after forward chaining. The upload and storage speed is about 3,000 statements/sec at the maximal size of the repository, but it starts at more than 18,000 statements/sec for a small repository and slows down smoothly. As can be expected for such an inference strategy, delete operations are expensive, taking as long as a few minutes. At the same time, a variety of queries can be evaluated within milliseconds. The experiment shows that such reasoners can be efficient for very large knowledge bases in scenarios where delete operations need not be handled in real time.
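
To illustrate the materialization strategy the abstract describes, the sketch below shows forward chaining of entailment rules over a small triple set until a fixpoint is reached. This is not OWLIM's actual engine or API; the class name, example URIs, and the two RDFS-style rules (subClassOf transitivity and type inheritance) are illustrative assumptions, chosen only to show how explicit statements expand under materialization.

```java
import java.util.*;

/** Minimal forward-chaining sketch (not OWLIM's engine): materializes
 *  inferences for two RDFS-style rules until no new statements appear. */
public class ForwardChainingSketch {

    record Triple(String s, String p, String o) {}

    static final String SUBCLASS = "rdfs:subClassOf";
    static final String TYPE = "rdf:type";

    public static void main(String[] args) {
        // Explicit statements (hypothetical example URIs)
        Set<Triple> kb = new HashSet<>(List.of(
            new Triple("ex:Company", SUBCLASS, "ex:Organization"),
            new Triple("ex:Organization", SUBCLASS, "ex:Agent"),
            new Triple("ex:Sirma", TYPE, "ex:Company")));

        boolean changed = true;
        while (changed) {                        // iterate to fixpoint
            changed = false;
            List<Triple> inferred = new ArrayList<>();
            for (Triple t1 : kb) {
                for (Triple t2 : kb) {
                    // Rule 1: subClassOf is transitive
                    if (t1.p().equals(SUBCLASS) && t2.p().equals(SUBCLASS)
                            && t1.o().equals(t2.s())) {
                        inferred.add(new Triple(t1.s(), SUBCLASS, t2.o()));
                    }
                    // Rule 2: instances inherit types along subClassOf
                    if (t1.p().equals(TYPE) && t2.p().equals(SUBCLASS)
                            && t1.o().equals(t2.s())) {
                        inferred.add(new Triple(t1.s(), TYPE, t2.o()));
                    }
                }
            }
            for (Triple t : inferred) {
                if (kb.add(t)) changed = true;   // only new statements trigger another pass
            }
        }
        // Explicit plus materialized statements
        kb.forEach(System.out::println);
    }
}
```

The sketch makes the reported numbers plausible: materialization is why 10 million explicit statements grow to roughly 19 million stored ones, and why deletes are costly, since removing an explicit statement can invalidate inferred statements and force part of the closure to be recomputed, whereas queries run fast because all inferences are already stored.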