Differentially private search log sanitization with optimal output utility

  • Authors:
  • Yuan Hong; Jaideep Vaidya; Haibing Lu; Mingrui Wu

  • Affiliations:
  • Rutgers University, Newark, NJ; Rutgers University, Newark, NJ; Santa Clara University, CA; Microsoft, Redmond, WA

  • Venue:
  • Proceedings of the 15th International Conference on Extending Database Technology
  • Year:
  • 2012


Abstract

Web search logs contain extremely sensitive data, as evidenced by the recent AOL incident. However, storing and analyzing search logs can be very useful for many purposes (e.g., investigating human behavior). An important research question is therefore how to privately sanitize search logs. Several search log anonymization techniques have been proposed with concrete privacy models; however, in all of these solutions the output utility is merely evaluated rather than maximized in any fashion. For effective search log anonymization, it is desirable to derive outputs with optimal utility while meeting the privacy standard. In this paper, we propose utility-maximizing sanitization of search logs based on the rigorous privacy guarantee of differential privacy. Specifically, we formulate optimization models that maximize the output utility of the sanitization for different applications, while ensuring that the sanitization process satisfies differential privacy. An added benefit is that our novel randomization strategy maintains the schema integrity of the output search logs. A comprehensive evaluation on real search logs validates the approach and demonstrates its robustness and scalability.
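
As a rough illustration of the kind of differentially private release the abstract refers to, the sketch below adds Laplace noise to per-query counts of a search log so that the published counts satisfy epsilon-differential privacy. This is a minimal sketch of the generic Laplace mechanism only, not the paper's optimization-based sanitization; the function name sanitize_query_counts, the sensitivity of 1, and the epsilon values are assumptions introduced for the example.

    # Minimal sketch: epsilon-differentially private per-query counts via the
    # Laplace mechanism (illustrative only, not the paper's optimization model).
    from collections import Counter

    import numpy as np

    def sanitize_query_counts(queries, epsilon=1.0, seed=None):
        """Return noisy per-query counts satisfying epsilon-differential privacy,
        assuming one record changes any single count by at most 1 (sensitivity 1).
        Negative noisy counts are clipped to zero before release."""
        rng = np.random.default_rng(seed)
        counts = Counter(queries)
        scale = 1.0 / epsilon  # Laplace scale = sensitivity / epsilon
        return {
            q: max(0, int(round(c + rng.laplace(0.0, scale))))
            for q, c in counts.items()
        }

    # Toy usage on a tiny search log of query strings.
    log = ["weather", "weather", "flights to nyc", "weather", "privacy"]
    print(sanitize_query_counts(log, epsilon=0.5, seed=42))

Note that releasing raw noisy counts in this way does not by itself preserve the structure of the original log; the abstract's point is that the proposed randomization additionally maintains the schema integrity of the output while choosing the noise so as to maximize utility for a target application.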