The role of trust in automation reliance

  • Authors:
  • Mary T. Dzindolet, Department of Psychology, Cameron University, 2800 Gore Blvd, Lawton, OK
  • Scott A. Peterson, Department of Psychology, Cameron University, 2800 Gore Blvd, Lawton, OK
  • Regina A. Pomranky, Army Research Laboratory, Bldg 3040 Room 220, Fort Sill, OK
  • Linda G. Pierce, Army Research Laboratory, Bldg 3040 Room 220, Fort Sill, OK
  • Hall P. Beck, Department of Psychology, Appalachian State University, Boone, NC

  • Venue:
  • International Journal of Human-Computer Studies, Special issue: Trust and technology
  • Year:
  • 2003

Abstract

A recent and dramatic increase in the use of automation has not yielded comparable improvements in performance. Researchers have found that human operators often underutilize (disuse) and over-rely on (misuse) automated aids (Parasuraman and Riley, 1997). Three studies were performed with Cameron University students to explore the relationships among automation reliability, trust, and reliance. With the assistance of an automated decision aid, participants viewed slides of Fort Sill terrain and indicated the presence or absence of a camouflaged soldier. Results from the three studies indicate that trust is an important factor in understanding automation reliance decisions. Participants initially considered the automated decision aid trustworthy and reliable. After observing the automated aid make errors, participants distrusted even reliable aids, unless an explanation was provided regarding why the aid might err. Knowing why the aid might err increased trust in the decision aid and increased automation reliance, even when the trust was unwarranted. Our studies suggest a need for future research focused on understanding automation use, examining individual differences in automation reliance, and developing valid and reliable self-report measures of trust in automation.