Support vector domain description. Pattern Recognition Letters, special issue on pattern recognition in practice VI.
One-class SVMs for document classification. The Journal of Machine Learning Research.
Extreme re-balancing for SVMs: a case study. ACM SIGKDD Explorations Newsletter, special issue on learning from imbalanced datasets.
A consistency-based model selection for one-class classification. Proceedings of the 17th International Conference on Pattern Recognition (ICPR'04), Volume 3.
Efficient performance estimate for one-class support vector machine. Pattern Recognition Letters.
Estimating the support of a high-dimensional distribution. Neural Computation.
Parameter estimation of one-class SVM on imbalance text classification. Proceedings of the 19th International Conference on Advances in Artificial Intelligence (AI'06), Canadian Society for Computational Studies of Intelligence.
On-line anomaly detection and resilience in classifier ensembles. Pattern Recognition Letters.
Applying one-class classification to the minority class in imbalanced data has been shown to have the potential to outperform conventional learning schemes. Parameter optimization is a significant issue, since one-class classifiers are sensitive to their parameters. For kernel-based one-class learners such as the one-class SVM and SVDD, the rejection rate is a one-class-specific parameter in addition to the kernel parameters. In this paper, we propose an improved framework in which only the minority (target) class is used for learning in the classification stage; both the minority and majority classes are then employed to estimate the generalization performance, which serves as the optimization criterion. Experiments on UCI and Reuters text data show that both parameter-optimized one-class classifiers outperform standard one-class learning schemes.
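The two-stage idea in the abstract (train on the minority target class only, then pick the rejection rate by scoring on both classes) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it substitutes a simple hypersphere data description (a crude stand-in for SVDD) for the kernel one-class classifier, and balanced accuracy for the paper's generalization estimate. The function names and the candidate rejection rates are invented for the example.

```python
import numpy as np

def fit_hypersphere(X_target, reject_rate):
    # Simplified stand-in for SVDD: a hypersphere centered on the target-class
    # mean, with a radius chosen so roughly `reject_rate` of the training
    # points fall outside it (the one-class-specific rejection parameter).
    center = X_target.mean(axis=0)
    dists = np.linalg.norm(X_target - center, axis=1)
    radius = np.quantile(dists, 1.0 - reject_rate)
    return center, radius

def balanced_accuracy(center, radius, X_minority, X_majority):
    # Stage 2 criterion: use BOTH classes to estimate generalization.
    # Minority points should fall inside the sphere, majority points outside.
    acc_min = np.mean(np.linalg.norm(X_minority - center, axis=1) <= radius)
    acc_maj = np.mean(np.linalg.norm(X_majority - center, axis=1) > radius)
    return 0.5 * (acc_min + acc_maj)

def select_reject_rate(X_minority, X_majority,
                       candidates=(0.01, 0.05, 0.1, 0.2, 0.3)):
    # Stage 1: each candidate model is fit on the minority class alone.
    # Stage 2: the candidate maximizing the two-class criterion is kept.
    return max(candidates,
               key=lambda r: balanced_accuracy(
                   *fit_hypersphere(X_minority, r), X_minority, X_majority))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_min = rng.normal(0.0, 1.0, size=(200, 2))   # minority / target class
    X_maj = rng.normal(4.0, 1.0, size=(800, 2))   # majority class
    print(select_reject_rate(X_min, X_maj))
```

With a real one-class SVM or SVDD the same loop would also search the kernel parameters; only the stage-1/stage-2 split matters for the framework.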