The illusion of agency: the influence of the agency of an artificial agent on its persuasive power

  • Authors:
  • Cees Midden; Jaap Ham

  • Affiliations:
  • Human-Technology Interaction, Eindhoven University of Technology, Eindhoven, The Netherlands (both authors)

  • Venue:
  • PERSUASIVE'12: Proceedings of the 7th International Conference on Persuasive Technology: Design for Health and Safety
  • Year:
  • 2012

Abstract

Artificial social agents can influence people. However, artificial social agents are not real humans, and people may ascribe less agency to them. Would the persuasive power of a social robot diminish when people ascribe only limited agency to it? To investigate this question, we conducted an experiment in which participants performed tasks on a washing machine and received either social feedback from a robot about their energy consumption (e.g., "Your energy consumption is too high") or factual, non-social feedback. The robot was introduced to participants as (a) an avatar that was controlled by a human in all its feedback actions (high agency), (b) an autonomous robot that controlled its own feedback actions (moderate agency), or (c) a robot that produced only random feedback (low agency). Results indicated that participants consumed less energy when a robotic social agent gave them feedback than when they received non-social feedback. This behavioral effect was independent of the level of robotic agency. In contrast, a perceived-agency measure indicated that the random-feedback robot received the lowest agency ratings. These results suggest that the persuasive power of robot behavior is independent of the extent to which the persuadee explicitly ascribes agency to the agent.