On the efficiency of subsumption algorithms

  • Authors:
  • G. Gottlob; A. Leitsch

  • Affiliations:
  • Politecnico di Milano, Dipartimento di Elettronica, Piazza Leonardo Da Vinci 32, 20133 Milano, Italy, and Technische Universität Wien, Vienna, Austria; Institut für Statistik und Mathematik, Wirtschaftsuniversität Wien, Augasse 2-6, 1090 Vienna, Austria

  • Venue:
  • Journal of the ACM (JACM)
  • Year:
  • 1985


Abstract

The costs of subsumption algorithms are analyzed by estimating the maximal number of unification attempts (worst-case unification complexity) made in deciding whether a clause C subsumes a clause D. For this purpose, the clauses C and D are characterized by the following parameters: the number of variables in C, the number of literals in C, the number of literals in D, and the maximal length of the literals. The worst-case unification complexity immediately yields a lower bound on the worst-case time complexity.

First, two well-known algorithms (Chang-Lee, Stillman) are investigated. Both are shown to have a very high worst-case time complexity. Then a new subsumption algorithm is defined, based on an analysis of the connection between variables and predicates in C. An upper bound on the worst-case unification complexity of this algorithm is derived; it is much lower than the lower bounds for the other two algorithms. Examples in which exponential costs are reduced to polynomial costs are discussed. Finally, the asymptotic growth of the worst-case complexity of all the algorithms discussed is tabulated for several combinations of the parameters.
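To make the subsumption problem concrete, the following is a minimal sketch (not the paper's improved algorithm, and not the exact Chang-Lee or Stillman procedures) of a naive backtracking subsumption test: C subsumes D iff some substitution maps every literal of C onto a literal of D. The term representation (tuples for compound terms, uppercase-initial strings for variables) is an illustrative assumption.

```python
# Naive backtracking subsumption test: clause C subsumes clause D iff
# there is a substitution s with C*s a subset of D.  Literals and
# compound terms are tuples (symbol, arg1, ...); variables are strings
# starting with an uppercase letter, e.g. "X" (an assumed encoding).

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def match(pattern, target, subst):
    """One-way matching: extend subst so that pattern*subst == target,
    or return None if impossible.  Only variables in pattern may bind."""
    if is_var(pattern):
        if pattern in subst:
            return subst if subst[pattern] == target else None
        new = dict(subst)
        new[pattern] = target
        return new
    if not isinstance(pattern, tuple):
        return subst if pattern == target else None
    if (not isinstance(target, tuple) or pattern[0] != target[0]
            or len(pattern) != len(target)):
        return None
    for p, t in zip(pattern[1:], target[1:]):
        subst = match(p, t, subst)
        if subst is None:
            return None
    return subst

def subsumes(C, D, subst=None):
    """Try to map each literal of C onto some literal of D, backtracking
    on failure.  In the worst case this makes on the order of |D|^|C|
    matching attempts -- the kind of blow-up the paper analyzes."""
    if subst is None:
        subst = {}
    if not C:
        return True
    first, rest = C[0], C[1:]
    for lit in D:
        s = match(first, lit, subst)
        if s is not None and subsumes(rest, D, s):
            return True
    return False
```

For example, `subsumes([("P", "X"), ("Q", "X", "Y")], [("P", "a"), ("Q", "a", "b"), ("R", "c")])` succeeds with the substitution {X: a, Y: b}, while `subsumes([("P", "X"), ("Q", "Y", "X")], [("P", "a"), ("Q", "a", "b")])` fails because X cannot be both a and b.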