Trust in Online Technology: Towards Practical Guidelines Based on Experimentally Verified Theory

  • Authors:
  • Christian Detweiler, Joost Broekens

  • Affiliations:
  • Man-Machine Interaction Group, Delft University of Technology, Delft, The Netherlands (both authors)

  • Venue:
  • Proceedings of the 13th International Conference on Human-Computer Interaction. Part III: Ubiquitous and Intelligent Interaction
  • Year:
  • 2009

Abstract

A large body of research attempts to define trust, yet relatively little work experimentally verifies what makes trust necessary in interactions between humans and technology. In this paper we identify the underlying elements of trust-requiring situations: (a) goals that involve dependence on another party, (b) a perceived lack of control over that party, (c) uncertainty regarding the party's ability, and (d) uncertainty regarding the party's benevolence. We then propose a model of how these elements interact and argue that it explains why certain situations require trust. To test the model's applicability to an instance of human-technology interaction, we built a website that required subjects to depend on an intelligent software agent to accomplish a task. A strong correlation was found between subjects' level of trust in the software and the ability they perceived the software as having. Strong negative correlations were found between perceived risk and perceived ability, and between perceived risk and trust.
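The reported findings rest on pairwise correlations between questionnaire scales. As a minimal sketch of that analysis, the following Python snippet computes Pearson correlations between trust, perceived ability, and perceived risk ratings; the ratings shown are hypothetical Likert-style values invented for illustration, not the paper's data.

```python
from statistics import mean, stdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Hypothetical 7-point ratings from five subjects (illustrative only).
trust   = [6, 5, 7, 3, 4]
ability = [6, 4, 7, 2, 5]
risk    = [2, 3, 1, 6, 4]

# Pattern mirroring the paper's findings: trust tracks perceived
# ability (positive r), while perceived risk correlates negatively
# with both trust and perceived ability.
print(round(pearson_r(trust, ability), 2))
print(round(pearson_r(risk, trust), 2))
print(round(pearson_r(risk, ability), 2))
```

In practice such an analysis would also report significance levels (e.g. via `scipy.stats.pearsonr`); the pure-stdlib version above just shows the coefficient itself.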