Integrating utility into face de-identification

  • Authors:
  • Ralph Gross, Edoardo Airoldi, Bradley Malin, Latanya Sweeney

  • Affiliations:
  • Data Privacy Laboratory, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA (all authors)

  • Venue:
  • PET'05: Proceedings of the 5th International Conference on Privacy Enhancing Technologies
  • Year:
  • 2005

Abstract

With the proliferation of inexpensive video surveillance and face recognition technologies, it is increasingly possible to track and match people as they move through public spaces. To protect the privacy of subjects visible in video sequences, prior research suggests using ad hoc obfuscation methods, such as blurring or pixelation of the face. However, there has been little investigation into how obfuscation affects the usability of the images, for example in classification tasks. In this paper, we demonstrate that at high obfuscation levels, ad hoc methods fail to preserve utility for various tasks, whereas at low obfuscation levels, they fail to prevent recognition. To overcome the implied tradeoff between privacy and utility, we introduce a new algorithm, k-Same-Select, a formal privacy protection schema based on k-anonymity that provably protects privacy and preserves data utility. We empirically validate our findings through evaluations on the FERET database, a large real-world dataset of facial images.
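The abstract only summarizes the approach; the precise algorithm is given in the paper body. As a rough illustration of the idea as stated here, the following is a minimal Python sketch of the k-Same family: each released image is the average of at least k originals (so any output could correspond to at least k people), and k-Same-Select first partitions the face set by a utility attribute so that averages only mix faces sharing that attribute. The pixel-space Euclidean distance, the greedy clustering, and all function names below are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def k_same(faces, k):
    """Sketch of k-Same de-identification: replace every face with the
    average of a cluster of at least k originals, giving k-anonymity
    over the face set. faces: (n, d) array, one flattened image per row.
    Distance metric and greedy clustering are illustrative choices."""
    faces = np.asarray(faces, dtype=float)
    if len(faces) < k:
        raise ValueError("need at least k faces for k-anonymity")
    remaining = list(range(len(faces)))
    out = np.empty_like(faces)
    while remaining:
        if len(remaining) < 2 * k:
            # Fewer than 2k left: average them all so no cluster
            # ends up smaller than k.
            cluster = list(remaining)
        else:
            # Greedily cluster a seed face with its k-1 nearest
            # neighbors in pixel space (seed itself is at distance 0).
            seed = remaining[0]
            dists = np.linalg.norm(faces[remaining] - faces[seed], axis=1)
            nearest = np.argsort(dists)[:k]
            cluster = [remaining[i] for i in nearest]
        avg = faces[cluster].mean(axis=0)
        for idx in cluster:
            out[idx] = avg
        remaining = [i for i in remaining if i not in cluster]
    return out

def k_same_select(faces, labels, k):
    """Sketch of k-Same-Select: partition faces by a utility attribute
    (e.g. expression or gender labels) and run k-Same within each
    partition, so averages only mix faces that share the attribute.
    Each partition is assumed to contain at least k faces."""
    faces = np.asarray(faces, dtype=float)
    labels = np.asarray(labels)
    out = np.empty_like(faces)
    for value in np.unique(labels):
        idx = np.where(labels == value)[0]
        out[idx] = k_same(faces[idx], k)
    return out
```

For example, a call like k_same_select(images, expression_labels, k=5) (hypothetical variable names) would de-identify the set while keeping the expression attribute intact for downstream classification, which is the utility-preservation property the abstract claims for k-Same-Select.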