The nature of statistical learning theory
Support vector domain description. Pattern Recognition Letters, special issue on pattern recognition in practice VI.
The Entire Regularization Path for the Support Vector Machine. The Journal of Machine Learning Research.
Estimating the Support of a High-Dimensional Distribution. Neural Computation.
Robust pseudo-hierarchical support vector clustering. SCIA'07, Proceedings of the 15th Scandinavian Conference on Image Analysis.
MMSVC: an efficient unsupervised learning approach for large-scale datasets. LSMS/ICSEE'10, Proceedings of the 2010 International Conference on Life System Modeling and Simulation and Intelligent Computing, and the 2010 International Conference on Intelligent Computing for Sustainable Energy and Environment, Part III.
The support vector domain description is a one-class classification method that estimates the shape and extent of the distribution of a data set, separating the data into outliers, which lie outside the decision boundary, and inliers, which lie inside it. The method bears a close resemblance to the two-class support vector machine classifier. Recently, it was shown that the regularization path of the support vector machine is piecewise linear and that the entire path can be computed efficiently. This paper shows that this property carries over to the support vector domain description. Using our results, the solution to the one-class classification problem can be obtained for any amount of regularization with roughly the same computational complexity required to solve for a single value of the regularization parameter. The ability to evaluate the results for any amount of regularization not only yields more accurate and reliable models, but also paves the way for new applications. We illustrate the potential of the method by determining the order of inclusion in the model for a set of corpora callosa outlines.
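To make the inlier/outlier split concrete, the following is a minimal sketch of the standard SVDD dual (minimize α᙭Kα subject to 0 ≤ αᵢ ≤ C, Σαᵢ = 1, with a Gaussian kernel, whose constant diagonal makes the linear term vanish), solved here by plain projected gradient descent rather than the path-following algorithm the paper proposes. The data set, kernel width, and cap C are illustrative choices, not from the paper; points whose coefficient sits at the cap C are the bounded support vectors, i.e. the outliers outside the ball.

```python
import numpy as np

def project_capped_simplex(v, C, s=1.0, iters=100):
    """Euclidean projection onto {0 <= a_i <= C, sum a_i = s} by
    bisection on the shift tau in clip(v - tau, 0, C).
    Feasibility requires n * C >= s."""
    lo, hi = v.min() - s, v.max()
    for _ in range(iters):
        tau = 0.5 * (lo + hi)
        if np.clip(v - tau, 0.0, C).sum() > s:
            lo = tau
        else:
            hi = tau
    return np.clip(v - 0.5 * (lo + hi), 0.0, C)

def svdd_dual(X, C, gamma, steps=4000):
    """Solve min_a a^T K a  s.t. 0 <= a_i <= C, sum a_i = 1
    (SVDD dual with a Gaussian kernel) by projected gradient descent."""
    n = len(X)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    alpha = np.full(n, 1.0 / n)
    eta = 1.0 / (2.0 * n)  # safe step: lambda_max(K) <= trace(K) = n
    for _ in range(steps):
        alpha = project_capped_simplex(alpha - eta * 2.0 * K @ alpha, C)
    return alpha

# Toy data (hypothetical): a tight cluster around the origin plus one far point.
X = np.array([[0.2, 0.0], [-0.2, 0.0], [0.0, 0.2], [0.0, -0.2],
              [0.0, 0.0], [5.0, 5.0]])
C = 0.3
alpha = svdd_dual(X, C, gamma=0.5)
# Bounded support vectors (alpha at the cap C) fall outside the ball.
outlier = alpha >= C - 1e-3
print(np.round(alpha, 3), outlier)
```

Decreasing C tightens the cap and forces more points to the bound, i.e. pushes more points outside the description; the paper's contribution is that the optimal α is piecewise linear in this regularization parameter, so the whole family of such solutions can be traced at once instead of re-solving the dual per value.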