On the measure of conflicts: Shapley Inconsistency Values

  • Authors:
  • Anthony Hunter; Sébastien Konieczny

  • Affiliations:
  • Department of Computer Science, University College London, UK; CRIL - CNRS, Université d'Artois, France

  • Venue:
  • Artificial Intelligence
  • Year:
  • 2010

Abstract

There are relatively few proposals for inconsistency measures for propositional belief bases. However, inconsistency measures are potentially as important as information measures for artificial intelligence, and more generally for computer science. In particular, they can be useful for defining various operators for belief revision, belief merging, and negotiation. The measures that have been proposed so far can be split into two classes. The first class of measures takes into account the number of formulae required to produce an inconsistency: the more formulae required to produce an inconsistency, the less inconsistent the base. The second class takes into account the proportion of the language that is affected by the inconsistency: the more propositional variables affected, the more inconsistent the base. Both approaches are sensible, but there is no proposal for combining them. We address this need in this paper: our proposal takes into account both the number of variables affected by the inconsistency and the distribution of the inconsistency among the formulae of the base. Our idea is to use existing inconsistency measures in order to define a game in coalitional form, and then to use the Shapley value to obtain an inconsistency measure that indicates the responsibility/contribution of each formula to the overall inconsistency in the base. This allows us to provide a more reliable image of the belief base and of the inconsistency in it.
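
To make the construction described in the abstract concrete, below is a minimal sketch (not the authors' implementation) of the general idea: take an inconsistency measure defined on subsets of the belief base, treat it as the worth function of a coalitional game over the formulae, and compute each formula's Shapley value as its share of responsibility for the inconsistency. The example uses the simple "drastic" measure (1 if a subset is inconsistent, 0 otherwise) as the underlying measure; the formula names, helper functions, and the toy base {a, ¬a, b} are illustrative assumptions, not taken from the paper.

```python
# Sketch: Shapley inconsistency values over a tiny propositional belief base,
# using a drastic measure (1 if the subset is inconsistent, 0 otherwise).
from itertools import product, permutations
from math import factorial

# Each formula is represented as a predicate over a truth assignment
# (a dict mapping variable names to booleans).  Example base: {a, not a, b}.
FORMULAS = {
    "a":     lambda m: m["a"],
    "not_a": lambda m: not m["a"],
    "b":     lambda m: m["b"],
}
VARS = ["a", "b"]

def consistent(subset):
    """True if some truth assignment satisfies every formula in `subset`."""
    for values in product([True, False], repeat=len(VARS)):
        model = dict(zip(VARS, values))
        if all(FORMULAS[name](model) for name in subset):
            return True
    return False

def drastic_measure(subset):
    """Drastic inconsistency measure: 1 if the subset is inconsistent, else 0."""
    return 0.0 if consistent(subset) else 1.0

def shapley_values(names, measure):
    """Shapley value of each player in the game whose worth function is `measure`."""
    values = {name: 0.0 for name in names}
    # Average each formula's marginal contribution over all orderings.
    for order in permutations(names):
        coalition = []
        for name in order:
            before = measure(coalition)
            coalition.append(name)
            values[name] += measure(coalition) - before
    n_orders = factorial(len(names))
    return {name: v / n_orders for name, v in values.items()}

if __name__ == "__main__":
    for name, value in shapley_values(list(FORMULAS), drastic_measure).items():
        print(f"{name}: {value:.3f}")
    # Expected output: the two conflicting formulae "a" and "not_a" each get 0.5,
    # while "b", which plays no part in the conflict, gets 0.0.
```

The brute-force enumeration of all orderings is exponential and only meant to illustrate the definition; it does, however, show the intended behaviour: formulae uninvolved in any conflict receive value 0, and the blame for a contradiction is shared among the formulae that jointly produce it.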