Enhancing one-class support vector machines for unsupervised anomaly detection

  • Authors:
  • Mennatallah Amer; Markus Goldstein; Slim Abdennadher

  • Affiliations:
  • German University in Cairo, Egypt; German Research Center for Artificial Intelligence (DFKI GmbH), Kaiserslautern, Germany; German University in Cairo, Egypt

  • Venue:
  • Proceedings of the ACM SIGKDD Workshop on Outlier Detection and Description
  • Year:
  • 2013

Abstract

Support Vector Machines (SVMs) have been one of the most successful machine learning techniques of the past decade. For anomaly detection, a semi-supervised variant, the one-class SVM, also exists. Here, only normal data is required for training before anomalies can be detected. In theory, the one-class SVM could also be used in an unsupervised anomaly detection setup, where no prior training is conducted. Unfortunately, it turns out that a one-class SVM is sensitive to outliers in the data. In this work, we apply two modifications in order to make one-class SVMs more suitable for unsupervised anomaly detection: robust one-class SVMs and eta one-class SVMs. The key idea of both modifications is that outliers should contribute less to the decision boundary than normal instances. Experiments performed on datasets from the UCI machine learning repository show that our modifications are very promising: compared with other standard unsupervised anomaly detection algorithms, the enhanced one-class SVMs are superior on two out of four datasets. In particular, the proposed eta one-class SVM has shown the most promising results.
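To illustrate the unsupervised setup the abstract describes, the sketch below fits a *standard* one-class SVM (scikit-learn's `OneClassSVM`, not the robust or eta variants proposed in the paper) directly on unlabeled data that already contains outliers. The synthetic dataset and all parameter choices here are illustrative assumptions, not from the paper.

```python
# Sketch: standard one-class SVM applied in an unsupervised setting,
# i.e. fit and scored on the same mixed data (normals + outliers).
# This is the baseline whose outlier sensitivity the paper addresses.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.RandomState(0)
normal = rng.randn(100, 2)                   # dense cluster of normal points
outliers = rng.uniform(6, 10, size=(5, 2))   # a few far-away anomalies
X = np.vstack([normal, outliers])

# nu upper-bounds the fraction of training points treated as outliers;
# gamma="scale" sets the RBF width from the data variance.
clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(X)

scores = clf.decision_function(X)  # negative values lie outside the boundary
labels = clf.predict(X)            # +1 = normal, -1 = anomaly
print(labels[-5:])                 # flags for the injected outliers
```

Note that in this unsupervised setup the outliers themselves influence the learned boundary, which is exactly the sensitivity the paper's modifications aim to reduce by down-weighting outlier contributions.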