Parameter optimization of kernel-based one-class classifier on imbalance text learning
PRICAI'06: Proceedings of the 9th Pacific Rim International Conference on Artificial Intelligence
Compared with conventional two-class learning schemes, one-class classification uses only a single class for training. Applying one-class classification to the minority class of an imbalanced dataset has been shown to outperform two-class learning. In this paper, to make the best use of all available information during learning, we propose a general framework that first trains on the minority class in the one-class classification stage, and then uses both the minority and majority classes to estimate the generalization performance of the constructed classifier. Based on this performance estimate, a parameter-search algorithm selects the best parameter settings for the classifier. Experiments on UCI and Reuters text data show that a one-class SVM embedded in this framework achieves much better performance than the standard one-class SVM alone and other learning schemes, such as one-class naive Bayes, one-class nearest neighbour, and neural networks.
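The framework described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn's `OneClassSVM`, a simple grid over `nu` and `gamma` as the parameter-search step, and balanced accuracy on a mixed holdout set as the generalization-performance estimate computed from both classes.

```python
# Hedged sketch of the framework (assumed components: scikit-learn's
# OneClassSVM, a nu/gamma grid search, balanced accuracy as the
# two-class performance estimate; none of these specifics come from
# the paper itself).
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.metrics import balanced_accuracy_score

def select_one_class_svm(X_minority_train, X_holdout, y_holdout,
                         nus=(0.05, 0.1, 0.2, 0.5),
                         gammas=(0.01, 0.1, 1.0)):
    """Train candidate one-class SVMs on minority-class data only,
    then score each candidate on a holdout set containing BOTH
    classes (y_holdout: 1 = minority, 0 = majority)."""
    best_score, best_model = -np.inf, None
    for nu in nus:
        for gamma in gammas:
            model = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma)
            model.fit(X_minority_train)  # one-class stage: minority only
            # predict() returns +1 for inliers (predicted minority),
            # -1 for outliers (predicted majority).
            pred = (model.predict(X_holdout) == 1).astype(int)
            score = balanced_accuracy_score(y_holdout, pred)
            if score > best_score:
                best_score, best_model = score, model
    return best_model, best_score
```

The key point the sketch illustrates is the asymmetry: the majority class never enters training, but it does drive model selection through the holdout score.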