Incorporation of a Regularization Term to Control Negative Correlation in Mixture of Experts

  • Authors:
  • Saeed Masoudnia; Reza Ebrahimpour; Seyed Ali Arani

  • Affiliations:
  • School of Mathematics, Statistics and Computer Science, University of Tehran, Tehran, Iran; Brain & Intelligent Systems Research Laboratory, Department of Electrical and Computer Engineering, Shahid Rajaee Teacher Training University, Tehran, Iran and School of Cognitive Sciences (SCS), ...; Brain & Intelligent Systems Research Laboratory, Department of Electrical and Computer Engineering, Shahid Rajaee Teacher Training University, Tehran, Iran

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2012

Abstract

Combining accurate neural networks (NNs) whose errors are negatively correlated in an ensemble greatly improves generalization ability. Mixture of experts (ME) is a popular combining method that employs a special error function to train NN experts simultaneously so that their errors become negatively correlated. Although ME can produce negatively correlated experts, unlike the negative correlation learning (NCL) method it provides no explicit control parameter for adjusting the strength of this correlation. In this study, an approach is proposed to introduce this advantage of NCL into the training algorithm of ME, called the mixture of negatively correlated experts (MNCE). In the proposed method, the control parameter of NCL is incorporated into the error function of ME, which enables the training algorithm to strike a better balance in the bias-variance-covariance trade-off and thus improves generalization ability. The proposed hybrid ensemble method, MNCE, is compared with its constituent methods, ME and NCL, on several benchmark problems. The experimental results show that our proposed ensemble method significantly improves performance over the original ensemble methods.
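
For context, the following is a minimal sketch of the standard NCL error function (as commonly formulated by Liu and Yao), which contains the explicit control parameter that the abstract says ME lacks and that MNCE incorporates. The exact MNCE error function, which embeds an analogous parameter into the gated ME objective, is defined in the paper itself and is not reproduced here:

\[
E_i = \frac{1}{2}\bigl(F_i(x) - d(x)\bigr)^2 + \lambda\, p_i,
\qquad
p_i = \bigl(F_i(x) - \bar{F}(x)\bigr)\sum_{j \neq i}\bigl(F_j(x) - \bar{F}(x)\bigr),
\]

where \(F_i\) is the output of expert \(i\), \(\bar{F}\) is the ensemble (mean) output, \(d\) is the target, and \(\lambda \in [0, 1]\) scales the correlation penalty: \(\lambda = 0\) reduces to independent training of the experts, while larger \(\lambda\) enforces stronger negative correlation among their errors.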