The dominant robot: threatening robots cause psychological reactance, especially when they have incongruent goals

  • Authors:
  • M. A. J. Roubroeks, J. R. C. Ham, C. J. H. Midden

  • Affiliations:
  • Department of Human-Technology Interaction, Eindhoven University of Technology (all authors)

  • Venue:
  • PERSUASIVE '10: Proceedings of the 5th International Conference on Persuasive Technology
  • Year:
  • 2010


Abstract

Persuasive technology can take the form of a social agent that persuades people to change their behavior or attitudes. However, like any persuasive technology, persuasive social agents might trigger psychological reactance, which can lead to restoration behavior. The current study investigated whether interacting with a persuasive robot can cause psychological reactance. Additionally, we investigated whether goal congruency plays a role in psychological reactance. Participants programmed a washing machine while a robot gave threatening advice. Confirming expectations, participants experienced more psychological reactance when receiving high-threatening advice than when receiving low-threatening advice. Moreover, when the robot gave high-threatening advice and expressed a goal incongruent with the participant's, participants reported the highest level of psychological reactance (on an anger measure). Finally, high-threatening advice led to more restoration behavior, and this relationship was partially mediated by psychological reactance. Overall, the results imply that under certain circumstances persuasive technology can trigger effects opposite to those intended, especially when the agent and the user hold incongruent goal intentions.