Organizations are increasingly investing in technology-enhanced learning systems to improve their employees' skills. Serious games are one example: the competitive and fun nature of games is intended to motivate employee participation. But any system that records employee data raises issues of privacy and trust. In this paper, we present a study on the privacy and trust implications of serious games in an organizational context. We report findings from 32 interviews with potential end-users of a serious games platform called TARGET. A qualitative analysis of the interviews reveals that participants anticipate privacy risks for the data generated during game play, and that their decisions to trust fellow employees and managers depend on the presence of specific trust signals. Failure to minimize privacy risks and maximize trust will affect both the acceptance of the system and the learning experience, undermining the primary purpose for which it was deployed. Game designers are therefore advised to provide mechanisms that let players selectively disclose their data, and organizations should refrain from using gaming data for appraisal or selection purposes and clearly communicate this policy to employees.