Trusting to learn: trust and privacy issues in serious games

  • Authors:
  • Miguel Malheiros, Charlene Jennett, Will Seager, M. Angela Sasse

  • Affiliation:
  • Dept. of Computer Science, University College London, UK (all authors)

  • Venue:
  • TRUST'11: Proceedings of the 4th International Conference on Trust and Trustworthy Computing
  • Year:
  • 2011

Abstract

Organizations are increasingly investing in technology-enhanced learning systems to improve their employees' skills. Serious games are one example; the competitive and fun nature of games is supposed to motivate employee participation. But any system that records employee data raises issues of privacy and trust. In this paper, we present a study on the privacy and trust implications of serious games in an organizational context. We present findings from 32 interviews with potential end-users of a serious games platform called TARGET. A qualitative analysis of the interviews reveals that participants anticipate privacy risks for the data generated in game playing, and that their decision to trust their fellow employees and managers depends on the presence of specific trust signals. Failure to minimize privacy risks and maximize trust will affect the acceptance of the system and the learning experience, thus undermining the primary purpose for which it was deployed. Game designers are advised to provide mechanisms for selective disclosure of player data; organizations should not use gaming data for appraisal or selection purposes, and should clearly communicate this to employees.