A signal processing view on packet sampling and anomaly detection

  • Authors:
  • Daniela Brauckhoff; Kave Salamatian; Martin May

  • Affiliations:
  • Computer Engineering and Networks Laboratory, ETH Zurich, Zurich, Switzerland; Computing Department, Lancaster University, Lancaster, England; Paris Research Lab, Thomson, Paris, France

  • Venue:
  • INFOCOM'10: Proceedings of the 29th Conference on Information Communications
  • Year:
  • 2010

Abstract

Anomaly detection methods typically operate on preprocessed traffic traces. First, most traffic-capturing devices today employ random packet sampling, where each packet is selected with a certain probability, to cope with increasing link speeds. Second, temporal aggregation, where all packets in a measurement interval are represented by their temporal mean, is applied to transform the traffic trace to the observation timescale of interest for anomaly detection. These preprocessing steps affect the temporal correlation structure of traffic that is used by anomaly detection methods such as Kalman filtering or PCA, and thus have an impact on anomaly detection performance. Prior work has analyzed how packet sampling degrades the accuracy of anomaly detection methods; however, neither theoretical explanations nor solutions to the sampling problem have been provided. This paper makes the following key contributions: (i) It provides a thorough analysis and quantification of how random packet sampling and temporal aggregation modify the signal properties by introducing noise, distortion, and aliasing. (ii) It shows that the aliasing introduced by the aggregation step has the largest impact on the correlation structure. (iii) It proposes replacing the aggregation step with a specifically designed low-pass filter that reduces the aliasing effect. (iv) Finally, it shows that with this solution applied, the performance of anomaly detection systems can be considerably improved in the presence of packet sampling.
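
The sketch below (not the authors' code) illustrates the preprocessing pipeline the abstract describes: random packet sampling, temporal aggregation by averaging over a measurement interval, and, as an alternative, a low-pass filter applied before downsampling to reduce aliasing. The signal parameters, the sampling probability p, and the FIR filter design are illustrative assumptions.

```python
import numpy as np
from scipy.signal import firwin, lfilter

rng = np.random.default_rng(0)

# Hypothetical per-millisecond packet counts (the full-rate traffic signal).
n_bins = 60_000                      # 60 s of traffic at 1 ms resolution
packets = rng.poisson(lam=20.0, size=n_bins)

# (1) Random packet sampling: each packet is kept independently with
#     probability p, then rescaled by 1/p so the mean stays unbiased
#     (this step introduces sampling noise).
p = 0.01
sampled = rng.binomial(packets, p) / p

# (2) Temporal aggregation: represent each measurement interval by the
#     temporal mean of the sampled signal, i.e. averaging + downsampling.
M = 1_000                            # aggregate 1 ms bins into 1 s intervals
aggregated = sampled.reshape(-1, M).mean(axis=1)

# (3) Anti-aliasing alternative: low-pass filter below the new Nyquist
#     frequency before decimating, in the spirit of the paper's proposal.
cutoff = 1.0 / M                     # normalized cutoff (Nyquist = 1.0)
fir = firwin(numtaps=201, cutoff=cutoff)
filtered = lfilter(fir, [1.0], sampled)
decimated = filtered[::M]

print(aggregated[:5])
print(decimated[:5])
```

Under these assumptions, `aggregated` corresponds to the conventional mean-per-interval preprocessing, while `decimated` is the low-pass-filtered variant; the point of the paper is that the latter suppresses the aliased high-frequency content that would otherwise distort the correlation structure seen by detectors such as Kalman filtering or PCA.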