Bayesian classifiers for positive unlabeled learning

  • Authors:
  • Jiazhen He, Yang Zhang, Xue Li, Yong Wang

  • Affiliations:
  • College of Information Engineering, Northwest A&F University, China
  • College of Information Engineering, Northwest A&F University and State Key Laboratory for Novel Software Technology, Nanjing University, China
  • School of Information Technology and Electrical Engineering, The University of Queensland, Australia
  • School of Computer, Northwestern Polytechnical University, China

  • Venue:
  • WAIM'11 Proceedings of the 12th international conference on Web-age information management
  • Year:
  • 2011

Abstract

This paper studies the problem of Positive Unlabeled learning (PU learning), where only positive and unlabeled examples are available for training. Naive Bayes (NB) and Tree Augmented Naive Bayes (TAN) have previously been extended to PU learning algorithms (PNB and PTAN). However, they require a user-specified parameter, which is difficult to provide in practice. We estimate this parameter following [2], under the "selected completely at random" assumption, and reformulate these two algorithms accordingly. Furthermore, we extend the supervised algorithms Averaged One-Dependence Estimators (AODE), Hidden Naive Bayes (HNB) and Full Bayesian network Classifier (FBC) to PU learning algorithms (PAODE, PHNB and PFBC, respectively). Experimental results on 20 UCI datasets show that the performance of the Bayesian algorithms for PU learning is comparable to that of the corresponding supervised ones in most cases. Additionally, PNB and PFBC are more robust against unlabeled data, and PFBC generally performs best.
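The parameter estimation the abstract attributes to [2] can be sketched as follows. This is a hedged toy illustration, not the paper's own algorithm: under the "selected completely at random" (SCAR) assumption, labeled examples are drawn uniformly from the positives, so P(s=1 | x) = c · P(y=1 | x) for a constant label frequency c = P(s=1 | y=1). A calibrated model g(x) ≈ P(s=1 | x) then averages to roughly c over a set of labeled positives, which recovers P(y=1 | x) = g(x) / c from positive and unlabeled data alone. The synthetic data, feature probabilities, and frequency-table stand-in for a trained classifier below are all assumptions made for the demo.

```python
import random
from collections import defaultdict

random.seed(0)

# Synthetic PU data (assumed for illustration): 3 binary features, each on
# with probability 0.9 for positives and 0.1 for negatives; only a fraction
# c_true of the positives receive a label (s=1), per the SCAR assumption.
c_true, d, n = 0.5, 3, 5000
data = []
for _ in range(n):
    pos = random.random() < 0.5
    x = tuple(int(random.random() < (0.9 if pos else 0.1)) for _ in range(d))
    s = 1 if (pos and random.random() < c_true) else 0
    data.append((x, s))

# Stand-in for a trained probabilistic classifier: estimate
# g(x) = P(s=1 | x) by the empirical labeled frequency in each feature cell.
# (A real PU pipeline would train NB/TAN etc. here; a frequency table keeps
# the sketch self-contained and well calibrated on this tiny feature space.)
counts = defaultdict(lambda: [0, 0])  # cell -> [labeled count, total count]
for x, s in data:
    counts[x][0] += s
    counts[x][1] += 1

def g(x):
    labeled, total = counts[x]
    return labeled / total if total else 0.0

# Estimate c as the average of g(x) over the labeled (hence positive)
# examples -- the estimator from [2] under SCAR.
labeled_xs = [x for x, s in data if s == 1]
c_hat = sum(g(x) for x in labeled_xs) / len(labeled_xs)

# Adjusted posterior for a new example: P(y=1 | x) ~= g(x) / c_hat.
x_new = (1, 1, 1)
p_y = min(1.0, g(x_new) / c_hat)
print(f"c_hat = {c_hat:.2f} (true {c_true}), P(y=1 | x={x_new}) = {p_y:.2f}")
```

Because the classes here are nearly separable, the average of g(x) over labeled positives lands close to the true label frequency; with weakly informative features this estimator is known to underestimate c, which is one reason calibration of g matters.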