Hybrid sampling on mutual information entropy-based clustering ensembles for optimizations

  • Authors:
  • Feng Wang; Cheng Yang; Zhiyi Lin; Yuanxiang Li; Yuan Yuan

  • Affiliations:
  • State Key Lab of Software Engineering, Wuhan University, Wuhan, China; State Key Lab of Software Engineering, Wuhan University, Wuhan, China; Guangdong University of Technology, Guangzhou, China; State Key Lab of Software Engineering, Wuhan University, Wuhan, China; School of Engineering and Applied Science, Aston University, Birmingham B4 7ET, UK

  • Venue:
  • Neurocomputing
  • Year:
  • 2010

Quantified Score: Hi-index 0.01

Abstract

In this paper, we focus on the design of bivariate EDAs (estimation of distribution algorithms) for discrete optimization problems and propose a new approach named HSMIEC. Because current EDAs spend considerable time in the statistical learning process when the relationships among the variables are complicated, we employ Selfish Gene theory (SG) in this approach and build a Mutual Information and Entropy based Clustering (MIEC) model to optimize the probability distribution of the virtual population. The model uses a hybrid sampling method that accounts for both clustering accuracy and clustering diversity, and an incremental learning and resampling scheme optimizes the parameters of the correlations among the variables. Experimental results on several benchmark problems demonstrate that HSMIEC often performs better than other EDAs such as BMDA, COMIT, MIMIC and ECGA.
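The abstract describes a bivariate EDA: pairwise dependencies among variables are measured by mutual information, a model of the selected population is built from those dependencies, and the next generation is sampled from that model. The paper's actual HSMIEC algorithm (with its SG virtual population and MIEC clustering) is not reproduced here; the sketch below only illustrates the generic bivariate-EDA idea on a OneMax problem. The greedy pairing heuristic, the `mi_threshold` value, and all function names are illustrative assumptions, not the authors' method.

```python
import math
import random

def mutual_information(pop, i, j):
    """Empirical mutual information (in nats) between binary variables i and j."""
    n = len(pop)
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = sum(1 for x in pop if x[i] == a and x[j] == b) / n
            p_a = sum(1 for x in pop if x[i] == a) / n
            p_b = sum(1 for x in pop if x[j] == b) / n
            if p_ab > 0:  # skip empty cells; p_a, p_b are then nonzero too
                mi += p_ab * math.log(p_ab / (p_a * p_b))
    return mi

def eda_step(pop, fitness, sel_frac=0.5, mi_threshold=0.02):
    """One generation: truncation selection, bivariate model fit, resampling."""
    n, d = len(pop), len(pop[0])
    order = sorted(range(n), key=lambda k: fitness[k], reverse=True)
    sel = [pop[k] for k in order[: int(n * sel_frac)]]
    # Greedily pair variables whose mutual information exceeds the threshold
    # (a crude stand-in for a clustering step; singletons stay independent).
    unused, pairs, singles = list(range(d)), [], []
    while unused:
        i = unused.pop()
        best, best_mi = None, mi_threshold
        for j in unused:
            m = mutual_information(sel, i, j)
            if m > best_mi:
                best, best_mi = j, m
        if best is not None:
            unused.remove(best)
            pairs.append((i, best))
        else:
            singles.append(i)
    # Resample: paired variables from their empirical joint distribution,
    # singleton variables from their empirical marginals.
    new = [[0] * d for _ in range(n)]
    m = len(sel)
    for i, j in pairs:
        joint = [sum(1 for x in sel if x[i] == a and x[j] == b) / m
                 for a in (0, 1) for b in (0, 1)]
        for row in new:
            idx = random.choices(range(4), weights=joint)[0]
            row[i], row[j] = idx // 2, idx % 2
    for i in singles:
        p = sum(x[i] for x in sel) / m
        for row in new:
            row[i] = 1 if random.random() < p else 0
    return new
```

Run for a few generations on OneMax (maximize the number of ones) and the selected-set marginals drift toward 1, so the sampled population converges toward the optimum; real bivariate EDAs such as MIMIC or BMDA replace the greedy pairing here with a principled dependency structure.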