Support vector data description (SVDD) has become an attractive kernel method owing to its good performance on many novelty-detection problems. As with the support vector machine (SVM), the SVDD decision function is expressed as a kernel expansion, so its run-time complexity is linear in the number of support vectors. For applications that demand a fast real-time response, speeding up the evaluation of the decision function is crucial. A fast SVDD (F-SVDD) algorithm is presented to address this issue. F-SVDD first establishes several geometric properties of the feature space induced by the Gaussian kernel and then, using these properties, solves the preimage problem for an agent of the SVDD sphere center. The kernel expansion can thus be compressed into one with a single term, so the run-time complexity of the F-SVDD decision function is constant rather than linear in the number of support vectors. Experimental results are encouraging.
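The compression idea above can be sketched in code. The snippet below is a minimal illustration, not the paper's method: it approximates a preimage of the SVDD center's kernel expansion with the standard fixed-point iteration of Schölkopf and Mika for Gaussian kernels (the paper instead exploits its own geometric properties to obtain the preimage), then evaluates a one-term decision function. All function names, the threshold, and the weight `beta` are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    # k(x, y) = exp(-||x - y||^2 / sigma^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / sigma ** 2)

def preimage_of_center(SV, alpha, sigma, n_iter=100):
    """Approximate a preimage z of the expansion sum_i alpha_i * phi(x_i)
    via fixed-point iteration (Schölkopf/Mika style), used here as a
    stand-in for the paper's geometric construction."""
    z = (alpha[:, None] * SV).sum(0) / alpha.sum()  # start at weighted mean
    for _ in range(n_iter):
        w = alpha * gaussian_kernel(z[None, :], SV, sigma)[0]
        z = (w[:, None] * SV).sum(0) / w.sum()
    return z

def compressed_decision(x, z, beta, threshold, sigma):
    # One-term kernel expansion: a single kernel evaluation per test point,
    # instead of one per support vector.
    return beta * gaussian_kernel(x[None, :], z[None, :], sigma)[0, 0] >= threshold

# Toy usage: four support vectors symmetric about the origin with equal
# weights; the preimage lands at their centroid.
SV = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
alpha = np.full(4, 0.25)
z = preimage_of_center(SV, alpha, sigma=2.0)
inside = compressed_decision(np.array([0.5, 0.5]), z, 1.0, np.exp(-1.0), 2.0)
outside = compressed_decision(np.array([3.0, 0.0]), z, 1.0, np.exp(-1.0), 2.0)
```

With a single representative point `z`, each test evaluation costs one kernel computation regardless of the number of support vectors, which is the source of the constant run-time complexity claimed for F-SVDD.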