Privacy risk models for designing privacy-sensitive ubiquitous computing systems

  • Authors:
  • Jason I. Hong
  • Jennifer D. Ng
  • Scott Lederer
  • James A. Landay

  • Affiliations:
  • University of California, Berkeley, Berkeley, CA (Hong, Ng, Lederer)
  • University of Washington, Seattle, WA (Landay)

  • Venue:
  • DIS '04: Proceedings of the 5th Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques
  • Year:
  • 2004

Abstract

Privacy is a difficult design issue that is becoming increasingly important as we push into ubiquitous computing environments. While there is a fair amount of theoretical work on designing for privacy, there are few practical methods to help designers create applications that give end-users a level of privacy protection commensurate with the domain, the community of users, and the risks and benefits to all stakeholders in the intended system. Toward this end, we propose privacy risk models as a general method for refining privacy from an abstract concept into concrete issues for specific applications, and for prioritizing those issues. In this paper, we introduce a privacy risk model we have developed specifically for ubiquitous computing, and outline two case studies describing our use of this model in the design of two ubiquitous computing applications.