Parameter optimization of Kernel-based one-class classifier on imbalance text learning

  • Authors:
  • Ling Zhuang;Honghua Dai

  • Affiliations:
  • School of Engineering and Information Technology, Deakin University, VIC, Australia

  • Venue:
  • PRICAI'06: Proceedings of the 9th Pacific Rim International Conference on Artificial Intelligence
  • Year:
  • 2006


Abstract

Applying one-class classification to the minority class in imbalanced data has been shown to have the potential to achieve better performance than conventional learning schemes. Parameter optimization is a significant issue when the one-class classifier is sensitive to its parameters. For kernel-based one-class learning schemes such as one-class SVM and SVDD, the rejection rate is a one-class-specific parameter that must be tuned in addition to the parameters of the kernel itself. In this paper, we propose an improved framework in which the minority target class is first used alone for learning in the classification stage; both the minority and majority classes are then employed to estimate the generalization performance, and this estimate is used as the optimization criterion. Experiments on UCI and Reuters text data show that both parameter-optimized one-class classifiers outperform the other standard one-class learning schemes.
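The framework described in the abstract — train on the minority class only, then score parameter settings on both classes — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes synthetic 2-D data, uses scikit-learn's `OneClassSVM` (where `nu` upper-bounds the rejection rate and `gamma` is the RBF kernel parameter), and uses the geometric mean of the per-class accuracies as a hypothetical stand-in for the paper's generalization criterion.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# Synthetic imbalanced data: small minority "target" class, large majority class
minority = rng.normal(loc=0.0, scale=1.0, size=(60, 2))
majority = rng.normal(loc=4.0, scale=1.5, size=(600, 2))

best = None
# Grid over the kernel parameter (gamma) and the rejection rate bound (nu)
for gamma in [0.01, 0.1, 1.0]:
    for nu in [0.05, 0.1, 0.2]:
        clf = OneClassSVM(kernel="rbf", gamma=gamma, nu=nu)
        clf.fit(minority)  # classification stage: minority class only
        # Estimation stage: both classes score the candidate parameters
        tp = np.mean(clf.predict(minority) == 1)   # minority accepted
        tn = np.mean(clf.predict(majority) == -1)  # majority rejected
        gmean = np.sqrt(tp * tn)  # balanced criterion suited to imbalance
        if best is None or gmean > best[0]:
            best = (gmean, gamma, nu)

print(f"g-mean={best[0]:.3f} at gamma={best[1]}, nu={best[2]}")
```

The key point mirrored from the paper is the split of roles: the majority class never influences the decision boundary directly, only the selection among candidate parameter settings.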