Opportunities for private and secure machine learning

  • Authors: Christopher W. Clifton
  • Affiliations: Purdue University, West Lafayette, IN, USA
  • Venue: Proceedings of the 1st ACM Workshop on AISec (AISec '08)
  • Year: 2008


Abstract

While the interplay of artificial intelligence and security covers a wide variety of topics, the 2008 AISec program largely focuses on the use of artificial intelligence techniques to aid with traditional security concerns: intrusion detection, security policy management, malware detection, etc. This talk addresses the flip side of the issue: using machine learning on sensitive data. The privacy-preserving data mining literature provides numerous solutions for machine learning on sensitive data while protecting that data from disclosure. Unfortunately, privacy alone has yet to provide the economic incentives for commercial development of this technology. This talk surveys this work (and its open challenges) in light of problems that may offer greater incentives for development: collaborative machine learning by parties that do not fully trust each other. Opportunities include job brokerage (assigning jobs in ways that most efficiently utilize the resources of competing companies), supply chain optimization, inter-agency data sharing, etc. Techniques similar to those in privacy-preserving data mining can enable such applications without the degree of information disclosure and trust currently required, providing a business model for development of the technology (and, as a by-product, reducing the number of trusted systems that need to be secured).
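To give a flavor of the kind of protocol the privacy-preserving data mining literature builds on, the sketch below simulates the well-known ring-based secure-sum idea: parties jointly compute the sum of their private inputs without any party seeing another's value. This is an illustrative simplification written for this page, not a protocol from the talk itself; the function name and parameters are assumptions, and a real deployment would need point-to-point encrypted channels and collusion resistance.

```python
import random

def secure_sum(private_values, modulus=2**32):
    """Simulate a ring-based secure sum over one logical round.

    Party 0 masks its input with a uniformly random offset, then each
    party in turn adds its own value modulo `modulus` and passes the
    running total along. Every intermediate total looks uniformly
    random, so no single party learns another's input. Party 0 finally
    subtracts the mask to recover the true sum.
    """
    mask = random.randrange(modulus)
    running = (mask + private_values[0]) % modulus
    for v in private_values[1:]:
        # Each party sees only a masked, random-looking running total.
        running = (running + v) % modulus
    # Party 0 removes its mask to reveal only the aggregate.
    return (running - mask) % modulus
```

For example, three competing companies holding workloads of 10, 20, and 30 jobs could learn the total demand (60) for joint capacity planning without disclosing their individual figures, which is exactly the trust reduction the abstract argues makes such collaborations commercially viable.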