Support vector machine (SVM) and support vector data description (SVDD) are well-known kernel-based methods for pattern classification. SVM constructs an optimal separating hyperplane, whereas SVDD constructs an optimal enclosing hypersphere to separate data between two classes. Although SVM and SVDD have been compared empirically in pattern classification experiments, there has been no theoretical comparison of the two methods. This paper presents a new theoretical model that unifies SVM and SVDD. The proposed model constructs two optimal points that can be transformed into either a hyperplane or a hypersphere; SVM and SVDD are therefore special cases of the proposed model. We applied the proposed model to dataset III of the motor imagery problem in BCI Competition II and achieved promising results.