This letter proposes and analyzes a method (the ξαρ-estimate) to estimate the generalization performance of the one-class support vector machine (SVM) for novelty detection. The method is an extended version of the ξα-estimate, which is used to estimate the generalization performance of the standard SVM for classification. Our method is derived by analyzing the connection between the one-class SVM and the standard SVM. Because it requires no computation-intensive resampling and can be computed immediately from the decision function of the one-class SVM, the method is computationally much more efficient than the leave-one-out method. Estimating the error rate with our method is more precise than using either the fraction of support vectors or the parameter ν of the one-class SVM. We also propose that the fraction of support vectors characterizes the precision of the one-class SVM. A theoretical analysis and experiments on an artificial data set and the widely known MNIST handwritten digit data set show that our method can effectively estimate the generalization performance of the one-class SVM for novelty detection.
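The abstract does not spell out the ξαρ-estimate itself, so it is not reproduced here. As a rough stand-in, the sketch below shows the baseline it is compared against: the fraction of support vectors of a trained one-class SVM, which (by the ν-property of the one-class formulation) is lower-bounded by ν, while the fraction of training outliers is upper-bounded by ν. The use of scikit-learn's `OneClassSVM`, the RBF kernel width, and the synthetic Gaussian data are all illustrative assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.svm import OneClassSVM

# Illustrative synthetic data (assumption; the paper uses artificial
# data and MNIST, neither of which is reproduced here).
rng = np.random.RandomState(0)
X_train = rng.randn(500, 2)
X_test = rng.randn(500, 2)

nu = 0.1  # the one-class SVM parameter ν mentioned in the abstract
clf = OneClassSVM(kernel="rbf", gamma=0.5, nu=nu).fit(X_train)

# Fraction of support vectors: the simple generalization-error
# proxy the abstract says the xi-alpha-rho estimate improves on.
frac_sv = clf.support_vectors_.shape[0] / X_train.shape[0]

# Empirical rejection rates, computed from the decision function
# (predict() returns -1 for points classified as novel).
train_err = float(np.mean(clf.predict(X_train) == -1))
test_err = float(np.mean(clf.predict(X_test) == -1))
```

By the ν-property, `frac_sv` should come out at or above `nu`, and `train_err` at or below it, so `frac_sv` acts as a (typically loose) upper estimate of the rejection rate.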