Personal privacy through understanding and action: five pitfalls for designers

  • Authors:
  • Scott Lederer; Jason I. Hong; Anind K. Dey; James A. Landay

  • Affiliations:
  • Group for User Interface Research, Computer Science Division, University of California, Berkeley, USA; University of California, Berkeley; Group for User Interface Research, Computer Science Division, University of California, Berkeley and Intel Research, Berkeley, CA, USA; DUB Group, Department of Computer Science and Engineering, University of Washington and Intel Research, Seattle, WA, USA

  • Venue:
  • Personal and Ubiquitous Computing
  • Year:
  • 2004

Abstract

To participate in meaningful privacy practice in the context of technical systems, people need opportunities to understand the extent of a system’s alignment with relevant practice and to conduct discernible social action through intuitive or sensible engagement with the system. Designing for such understanding and action through the feedback and control mechanisms of today’s devices is a significant challenge. To help designers meet this challenge, we describe five pitfalls to beware when designing interactive systems—on or off the desktop—with personal privacy implications. These pitfalls are: (1) obscuring potential information flow, (2) obscuring actual information flow, (3) emphasizing configuration over action, (4) lacking coarse-grained control, and (5) inhibiting existing practice. They are based on a review of the literature, on analyses of existing privacy-affecting systems, and on our own experiences designing a prototypical user interface for managing privacy in ubiquitous computing. We illustrate how some existing research and commercial systems—our prototype included—fall into these pitfalls and how some avoid them. We suggest that privacy-affecting systems that heed these pitfalls can help users appropriate and engage them in alignment with relevant privacy practice.