Evaluating interfaces for privacy policy rule authoring

  • Authors:
  • Clare-Marie Karat; John Karat; Carolyn Brodie; Jinjuan Feng

  • Affiliations:
  • IBM T.J. Watson Research Center; IBM T.J. Watson Research Center; IBM T.J. Watson Research Center; Towson University

  • Venue:
  • Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
  • Year:
  • 2006

Abstract

In organizations, privacy policy rules are often written by a team of people in different roles. Currently, people in these roles have no technological tools to guide the creation of clear, implementable, high-quality privacy policy rules. High-quality privacy rules can be the basis for verifiable automated privacy access decisions. An empirical study was conducted with 36 users who were novices in privacy policy authoring to evaluate the quality of rules created and user satisfaction with two experimental privacy authoring tools and a control condition. Results show that users presented with scenarios authored significantly higher-quality rules with either the natural language with a privacy rule guide tool or the structured list tool than with an unguided natural language control condition. The significant differences in quality appeared in both user self-ratings of rule quality and objective quality scores. Users also ranked the two experimental tools significantly higher than the control condition. Implications of the research and future research directions are discussed.
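
To make concrete what "implementable" rules enabling automated access decisions might look like, the sketch below models a structured rule as a small data record (user category, actions, data category, purpose, optional condition) and evaluates requests against it with a default-deny check. This is an illustrative assumption, not the authors' tool: the class names, fields, and `decide` function are hypothetical, chosen only to show how rules in a structured-list style could drive a verifiable permit/deny decision.

```python
from dataclasses import dataclass
from typing import FrozenSet, List, Optional


@dataclass(frozen=True)
class PrivacyRule:
    """One structured privacy rule: who may do what to which data, for what purpose."""
    user_category: str              # e.g. "billing staff"
    actions: FrozenSet[str]         # e.g. {"read"}
    data_category: str              # e.g. "payment history"
    purpose: str                    # e.g. "payment processing"
    condition: Optional[str] = None  # free-text condition; not evaluated in this sketch


@dataclass(frozen=True)
class AccessRequest:
    """A single access attempt to be checked against the rule set."""
    user_category: str
    action: str
    data_category: str
    purpose: str


def decide(rules: List[PrivacyRule], request: AccessRequest) -> bool:
    """Permit the request only if some rule explicitly covers it (default deny)."""
    return any(
        request.user_category == rule.user_category
        and request.action in rule.actions
        and request.data_category == rule.data_category
        and request.purpose == rule.purpose
        for rule in rules
    )


if __name__ == "__main__":
    rules = [
        PrivacyRule("billing staff", frozenset({"read"}),
                    "payment history", "payment processing"),
    ]
    permitted = AccessRequest("billing staff", "read",
                              "payment history", "payment processing")
    denied = AccessRequest("marketing staff", "read",
                           "payment history", "promotions")
    print(decide(rules, permitted))  # True  -- covered by the rule above
    print(decide(rules, denied))     # False -- no rule covers it, so deny
```

The point of the sketch is that a rule authored in a structured form maps directly onto machine-checkable fields, whereas an unguided natural language rule would first need to be parsed into such a structure before any automated decision could be made or verified.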