Trust decision-making in multi-agent systems

  • Authors:
  • Chris Burnett; Timothy J. Norman; Katia Sycara

  • Affiliations:
  • Department of Computing Science, University of Aberdeen, Scotland, UK; Department of Computing Science, University of Aberdeen, Scotland, UK; Robotics Institute, Carnegie Mellon University, Pittsburgh, PA

  • Venue:
  • IJCAI'11: Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence - Volume One
  • Year:
  • 2011

Abstract

Trust is crucial in dynamic multi-agent systems, where agents may frequently join and leave, and the structure of the society may often change. In these environments, it may be difficult for agents to form the stable trust relationships necessary for confident interactions. Societies may break down when trust between agents is too low to motivate interactions. In such settings, agents should make decisions about whom to interact with, given their degree of trust in the available partners. We propose a decision-theoretic model of trust decision making that allows controls, as well as trust, to be used to increase confidence in initial interactions. We consider explicit incentives, monitoring and reputation as examples of such controls. We evaluate our approach within a simulated, highly dynamic multi-agent environment, and show how this model supports the making of delegation decisions when trust is low.
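
To make the abstract's idea concrete, the following is a minimal sketch (not the authors' actual model) of a decision-theoretic delegation choice: trust is treated as an estimated probability of success, and controls such as explicit incentives or monitoring alter payoffs or probabilities at a cost. All class names, parameters, and numeric effects below are illustrative assumptions, not details taken from the paper.

```python
"""Hedged sketch of an expected-utility delegation decision with controls.

Assumptions (not from the paper): an incentive slightly raises the success
probability, monitoring halves the loss on failure, and both carry a cost.
"""
from dataclasses import dataclass


@dataclass
class Candidate:
    name: str
    trust: float           # estimated probability of successful delegation
    incentive_cost: float  # assumed cost of offering an explicit incentive
    monitor_cost: float    # assumed cost of monitoring the interaction


def expected_utility(c: Candidate, gain: float, loss: float,
                     use_incentive: bool, use_monitoring: bool) -> float:
    """Expected utility of delegating to candidate c under chosen controls."""
    p = c.trust
    cost = 0.0
    if use_incentive:
        p = min(1.0, p + 0.1)   # assumption: incentive boosts success chance
        cost += c.incentive_cost
    if use_monitoring:
        loss *= 0.5             # assumption: monitoring limits the downside
        cost += c.monitor_cost
    return p * gain - (1.0 - p) * loss - cost


def best_delegation(candidates, gain=10.0, loss=8.0):
    """Pick the candidate/control combination with the highest expected utility."""
    options = [(c, inc, mon)
               for c in candidates
               for inc in (False, True)
               for mon in (False, True)]
    return max(options,
               key=lambda o: expected_utility(o[0], gain, loss, o[1], o[2]))


if __name__ == "__main__":
    pool = [Candidate("a1", trust=0.4, incentive_cost=1.0, monitor_cost=0.5),
            Candidate("a2", trust=0.7, incentive_cost=1.5, monitor_cost=0.5)]
    chosen, inc, mon = best_delegation(pool)
    print(f"delegate to {chosen.name}, incentive={inc}, monitoring={mon}")
```

In this toy setup, a low-trust partner may still be worth delegating to once monitoring or incentives reduce the risk enough, which mirrors the abstract's point that controls can support delegation when trust alone is too low.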