Automatically Adapting a Trained Anomaly Detector to Software Patches

  • Authors:
  • Peng Li; Debin Gao; Michael K. Reiter

  • Affiliations:
  • Department of Computer Science, University of North Carolina, Chapel Hill, USA; School of Information Systems, Singapore Management University, Singapore; Department of Computer Science, University of North Carolina, Chapel Hill, USA

  • Venue:
  • RAID '09 Proceedings of the 12th International Symposium on Recent Advances in Intrusion Detection
  • Year:
  • 2009

Abstract

To detect the compromise of a running process by its deviation from the program's normal system-call behavior, an anomaly detector must first be trained on traces of system calls made by the program when provided clean inputs. When a patch for the monitored program is released, however, the system-call behavior of the new version might differ from that of the version it replaces, rendering the anomaly detector too inaccurate for monitoring the new version. In this paper we explore an alternative to collecting traces of the new program version in a clean environment (which may take effort to set up), namely adapting the anomaly detector to accommodate the differences between the old and new program versions. We demonstrate that this adaptation is feasible for such an anomaly detector, given the output of a state-of-the-art binary difference analyzer. Our analysis includes both proofs of properties of the adapted detector and an empirical evaluation of adapted detectors based on four software case studies.
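
To make the workflow in the abstract concrete, the following is a minimal sketch, not the paper's detector or its adaptation procedure: a generic sliding-window (n-gram) system-call model trained on clean traces of the old version, then rewritten using an assumed mapping between old- and new-version call sites (the window length N, the `call_map` dictionary, and all traces below are hypothetical stand-ins for the output of a binary difference analyzer).

```python
# Hypothetical sketch only: a sliding-window (n-gram) system-call model.
# The paper's detector and adaptation algorithm are not specified here;
# this merely illustrates "train on clean traces, then adapt the model
# using a mapping derived from a binary diff of the two program versions".

N = 3  # assumed window length

def train(traces):
    """Collect the set of length-N system-call windows seen in clean traces."""
    normal = set()
    for trace in traces:
        for i in range(len(trace) - N + 1):
            normal.add(tuple(trace[i:i + N]))
    return normal

def adapt(normal, call_map):
    """Rewrite each stored window by mapping old-version calls to their
    new-version counterparts (call_map stands in for binary-diff output)."""
    return {tuple(call_map.get(c, c) for c in window) for window in normal}

def is_anomalous(model, trace):
    """Flag the trace if it contains any window the model has never seen."""
    return any(tuple(trace[i:i + N]) not in model
               for i in range(len(trace) - N + 1))

# Toy usage with system calls represented as strings:
old_traces = [["open", "read", "write", "close"],
              ["open", "read", "read", "close"]]
model = train(old_traces)
# Suppose the patch replaced write() with pwrite() at the relevant call site.
adapted = adapt(model, {"write": "pwrite"})
print(is_anomalous(adapted, ["open", "read", "pwrite", "close"]))  # False
print(is_anomalous(adapted, ["open", "execve", "close"]))          # True
```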