Understanding and protecting privacy: formal semantics and principled audit mechanisms

  • Authors:
  • Anupam Datta (Carnegie Mellon University); Jeremiah Blocki (Carnegie Mellon University); Nicolas Christin (Carnegie Mellon University); Henry DeYoung (Carnegie Mellon University); Deepak Garg (Max Planck Institute for Software Systems, Germany); Limin Jia (Carnegie Mellon University); Dilsun Kaynar (Carnegie Mellon University); Arunesh Sinha (Carnegie Mellon University)

  • Venue:
  • ICISS'11: Proceedings of the 7th International Conference on Information Systems Security
  • Year:
  • 2011


Abstract

Privacy has become a significant concern in modern society as personal information about individuals is increasingly collected, used, and shared, often using digital technologies, by a wide range of organizations. While enabling useful services, certain information handling practices have aroused much indignation and protest in the name of privacy: organizations monitoring individuals' activities on the Web, data aggregation companies compiling massive databases of personal information, cell phone companies collecting and using location data about individuals, and online social networks and search engines. Similarly, as healthcare organizations embrace electronic health record systems and patient portals to give patients, employees, and business affiliates more efficient access to personal health information, there is trepidation that the privacy of patients may not be adequately protected if information handling practices are not carefully designed and enforced. Given this state of affairs, it is important to arrive at a general understanding of (a) why certain information handling practices arouse moral indignation and what practices or policies are appropriate in a given setting, and (b) how to represent and enforce such policies using information processing systems. This article summarizes progress on a research program driven by goal (b). We describe a semantic model and logic of privacy that formalizes privacy as a right to appropriate flows of personal information, the position taken by contextual integrity, a philosophical theory of privacy for answering questions of the form identified in (a). The logic is designed with the goal of enabling specification and enforcement of practical privacy policies.
It has been used to develop the first complete formalization of two US privacy laws: the HIPAA Privacy Rule, which prescribes and proscribes flows of personal health information, and the Gramm-Leach-Bliley Act, which similarly governs flows of personal financial information. Observing that preventive access control mechanisms are not sufficient to enforce such privacy policies, we develop two complementary audit mechanisms for policy enforcement. These mechanisms enable auditing of practical privacy policies, including the entire HIPAA Privacy Rule. The article concludes with a vision for further research in this area.
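To make the "appropriate flows" idea concrete, the following is a minimal, purely illustrative sketch of auditing an information-flow log against contextual-integrity-style norms (sender role, receiver role, information attribute, and purpose). The actual logic in the paper is far more expressive (a first-order temporal logic over traces, with obligations and incomplete information); all class names, roles, and norms below are hypothetical, not taken from the paper or from HIPAA.

```python
# Illustrative sketch only: toy contextual-integrity-style audit.
# A Flow records a disclosure event; a Norm says which flows are
# permitted. The audit flags every logged flow that no norm permits.
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    sender_role: str
    receiver_role: str
    attribute: str
    purpose: str

@dataclass(frozen=True)
class Norm:
    sender_role: str
    receiver_role: str
    attribute: str
    allowed_purposes: frozenset

    def permits(self, flow: Flow) -> bool:
        # A norm permits a flow when roles and attribute match and the
        # flow's purpose is among the norm's allowed purposes.
        return (flow.sender_role == self.sender_role
                and flow.receiver_role == self.receiver_role
                and flow.attribute == self.attribute
                and flow.purpose in self.allowed_purposes)

def audit(log, norms):
    """Return the flows in the log that no norm permits (violations)."""
    return [f for f in log if not any(n.permits(f) for n in norms)]

# Hypothetical norm: physicians may send health records to insurers
# for payment or treatment purposes.
norms = [Norm("physician", "insurer", "health_record",
              frozenset({"payment", "treatment"}))]
log = [
    Flow("physician", "insurer", "health_record", "payment"),
    Flow("physician", "marketer", "health_record", "advertising"),
]
violations = audit(log, norms)
# only the disclosure to the marketer is flagged
```

This after-the-fact check is the intuition behind audit-based enforcement: rather than blocking every questionable access up front, the mechanism examines a log of what actually happened and surfaces the flows that fall outside the policy.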