MICCLLR: Multiple-Instance Learning Using Class Conditional Log Likelihood Ratio

  • Authors:
  • Yasser El-Manzalawy; Vasant Honavar

  • Affiliations:
  • Department of Computer Science, Iowa State University, Ames, USA 50011-1040 and Systems and Computer Engineering, Al-Azhar University, Cairo, Egypt; Department of Computer Science, Iowa State University, Ames, USA 50011-1040

  • Venue:
  • DS '09 Proceedings of the 12th International Conference on Discovery Science
  • Year:
  • 2009

Abstract

Multiple-instance learning (MIL) is a generalization of the supervised learning problem in which each training observation is a labeled bag of unlabeled instances. Several supervised learning algorithms have been successfully adapted to the multiple-instance learning setting. We explore the adaptation of the Naive Bayes (NB) classifier and the use of its sufficient statistics to develop novel multiple-instance learning methods. Specifically, we introduce MICCLLR (multiple-instance class conditional log likelihood ratio), a method that maps each bag of instances to a single meta-instance using class conditional log likelihood ratio statistics, so that any supervised base classifier can be applied to the resulting meta-data. The results of our experiments with MICCLLR using different base classifiers suggest that no single base classifier consistently outperforms the others on all data sets. We show that a substantial improvement in performance is obtained by using an ensemble of MICCLLR classifiers trained with different base learners. We also show that a further gain in classification accuracy is obtained by applying AdaBoost.M1 to weak MICCLLR classifiers. Overall, our results suggest that the predictive performance of the three proposed variants of MICCLLR is competitive with some of the state-of-the-art MIL methods.
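
The paper itself is not reproduced on this page, so the following is a minimal Python sketch of the bag-to-meta-instance mapping described in the abstract. It assumes equal-width discretization of continuous attributes, Laplace smoothing of the class-conditional bin probabilities, instances inheriting their bag's label during estimation, and averaging of per-attribute log likelihood ratios over a bag's instances; the function names (fit_ccllr_stats, bag_to_meta) and the logistic-regression base classifier are illustrative choices, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def fit_ccllr_stats(bags, labels, n_bins=5, alpha=1.0):
    """Estimate per-attribute, per-bin class-conditional log likelihood
    ratios from instance-level counts (instances inherit bag labels)."""
    X = np.vstack(bags)  # pool all instances
    y = np.concatenate([np.full(len(b), lab) for b, lab in zip(bags, labels)])
    d = X.shape[1]
    # Equal-width discretization per attribute (interior bin edges only)
    lo, hi = X.min(axis=0), X.max(axis=0)
    edges = [np.linspace(lo[j], hi[j], n_bins + 1)[1:-1] for j in range(d)]
    binned = np.stack([np.digitize(X[:, j], edges[j]) for j in range(d)], axis=1)
    n_pos, n_neg = (y == 1).sum(), (y == 0).sum()
    llr = np.zeros((d, n_bins))
    for j in range(d):
        for b in range(n_bins):
            # Laplace-smoothed class-conditional bin probabilities
            p_pos = (((binned[:, j] == b) & (y == 1)).sum() + alpha) / (n_pos + alpha * n_bins)
            p_neg = (((binned[:, j] == b) & (y == 0)).sum() + alpha) / (n_neg + alpha * n_bins)
            llr[j, b] = np.log(p_pos / p_neg)
    return edges, llr

def bag_to_meta(bag, edges, llr):
    """Map a bag (n_i x d array) to one fixed-length meta-instance by
    looking up each attribute value's CCLLR and averaging over instances."""
    d = bag.shape[1]
    binned = np.stack([np.digitize(bag[:, j], edges[j]) for j in range(d)], axis=1)
    return llr[np.arange(d), binned].mean(axis=0)

# Usage sketch on toy data: fit the CCLLR statistics on training bags,
# map every bag to a meta-instance, then train any supervised classifier.
rng = np.random.default_rng(0)
bags = [rng.normal(0, 1, (4, 3)), rng.normal(1, 1, (6, 3))]
labels = [0, 1]
edges, llr = fit_ccllr_stats(bags, labels)
meta = np.stack([bag_to_meta(b, edges, llr) for b in bags])
clf = LogisticRegression().fit(meta, labels)
```

Once bags are mapped to fixed-length meta-instances in this way, the ensembling over different base learners and the boosting with AdaBoost.M1 that the abstract describes both reduce to standard single-instance model combination on the meta-data.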