Global/local hybrid learning of mixture-of-experts from labeled and unlabeled data

  • Authors:
  • Jong-Won Yoon, Sung-Bae Cho

  • Affiliations:
  • Dept. of Computer Science, Yonsei University, Seoul, Korea (both authors)

  • Venue:
  • HAIS'11: Proceedings of the 6th International Conference on Hybrid Artificial Intelligent Systems, Part I
  • Year:
  • 2011

Abstract

Mixture-of-experts (ME) models are useful for solving complex real-world classification problems. However, training an ME model with not only labeled data but also unlabeled data, which are easier to obtain, requires a new learning algorithm that accounts for the characteristics of the ME model. We propose global-local co-training (GLCT), a hybrid method that combines the supervised learning (SL) procedure of the ME model with co-training, so that the ME model is trained in a semi-supervised learning (SSL) manner. GLCT uses a global model and a local model together, since using the local model alone yields low accuracy when labeled training data are scarce. The two models enlarge the labeled data set from the unlabeled one and are retrained on it, supplementing each other. To evaluate the method, we performed experiments on benchmark data sets from the UCI machine learning repository. The results confirm the feasibility of GLCT; moreover, in a comparative experiment, GLCT outperformed the alternative method.
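
The abstract gives no implementation details, so the following is only a minimal sketch of the generic global/local co-training loop it describes, not the authors' actual algorithm. It assumes scikit-learn-style classifiers, with LogisticRegression standing in for the global model and k-NN standing in for the local model (the paper's local model is the mixture-of-experts itself); the function name glct_style_cotraining and all parameters are hypothetical.

```python
# Sketch: two models iteratively pseudo-label unlabeled data for a shared
# labeled pool, "supplementing each other" as the abstract describes.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

def glct_style_cotraining(X_lab, y_lab, X_unlab, rounds=10, k=5):
    # Hypothetical stand-ins for the paper's global and local models.
    global_model = LogisticRegression(max_iter=1000)
    local_model = KNeighborsClassifier(n_neighbors=5)

    X_l, y_l = X_lab.copy(), y_lab.copy()
    X_u = X_unlab.copy()

    for _ in range(rounds):
        if len(X_u) == 0:
            break
        global_model.fit(X_l, y_l)
        local_model.fit(X_l, y_l)

        # Each model contributes its k most confident pseudo-labels.
        new_idx, new_lab = [], []
        for model in (global_model, local_model):
            proba = model.predict_proba(X_u)
            conf = proba.max(axis=1)
            top = np.argsort(conf)[-k:]
            new_idx.extend(top.tolist())
            # classes_ is identical for both models (same y_l, sorted).
            new_lab.extend(model.classes_[proba[top].argmax(axis=1)])

        # Enlarge the labeled set and shrink the unlabeled one.
        new_idx = np.array(new_idx)
        X_l = np.vstack([X_l, X_u[new_idx]])
        y_l = np.concatenate([y_l, np.array(new_lab)])
        X_u = np.delete(X_u, np.unique(new_idx), axis=0)

    global_model.fit(X_l, y_l)
    local_model.fit(X_l, y_l)
    return global_model, local_model
```

Unlike classic two-view co-training (Blum and Mitchell), this sketch lets both models see all features and share one growing labeled pool, which matches the abstract's description of a global and a local model enlarging the labeled set together.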