The Support Vector Data Description (SVDD) was introduced to address the problem of anomaly (or outlier) detection. It essentially fits the smallest possible sphere around the given data points, allowing some points to be excluded as outliers. Whether a point is excluded is governed by a slack variable. Mathematically, the values of the slack variables are obtained by minimizing a cost function that balances the size of the sphere against the penalty associated with outliers. In this paper we argue that the SVDD slack variables lack a clear geometric meaning, and we therefore re-analyze the cost function to gain better insight into the characteristics of the solution. We also introduce and analyze two new definitions of slack variables, and show that one of the proposed methods behaves more robustly with respect to outliers, thus providing tighter bounds than SVDD.
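To make the trade-off described above concrete, the standard SVDD primal (minimize R² + C·Σξᵢ subject to ‖xᵢ − a‖² ≤ R² + ξᵢ and ξᵢ ≥ 0) can be solved numerically. The sketch below is illustrative only; the function name `svdd_fit` and the use of SciPy's SLSQP solver are our own choices, not part of the paper, and a dedicated QP solver would be used in practice.

```python
import numpy as np
from scipy.optimize import minimize

def svdd_fit(X, C=0.5):
    """Solve the SVDD primal directly:
        min over (a, R^2, xi):  R^2 + C * sum(xi)
        s.t. ||x_i - a||^2 <= R^2 + xi_i,  xi_i >= 0.
    Returns the sphere center a, squared radius R^2, and slacks xi.
    (Illustrative sketch; a QP solver would be preferable in practice.)
    """
    n, d = X.shape
    # Parameter vector z = [a (d entries), R^2 (1 entry), xi (n entries)].
    z0 = np.concatenate([X.mean(axis=0), [d * np.var(X)], np.zeros(n)])

    def cost(z):
        R2, xi = z[d], z[d + 1:]
        return R2 + C * xi.sum()

    def sphere_constraint(z):
        # Must be nonnegative: R^2 + xi_i - ||x_i - a||^2 for every i.
        a, R2, xi = z[:d], z[d], z[d + 1:]
        return R2 + xi - ((X - a) ** 2).sum(axis=1)

    constraints = [
        {"type": "ineq", "fun": sphere_constraint},
        {"type": "ineq", "fun": lambda z: z[d + 1:]},  # xi_i >= 0
    ]
    result = minimize(cost, z0, constraints=constraints,
                      method="SLSQP", options={"maxiter": 500})
    a, R2, xi = result.x[:d], result.x[d], result.x[d + 1:]
    return a, max(R2, 0.0), xi
```

With a small penalty C, a far-away point receives a large slack rather than inflating the sphere, which is exactly the behavior whose geometric interpretation the paper examines.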