Optimal Linear Combination of Neural Networks for Improving Classification Performance

  • Authors: Naonori Ueda
  • Affiliation: NTT Communication Science Laboratories, Kyoto, Japan
  • Venue: IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Year: 2000

Abstract

With a focus on classification problems, this paper presents a new method for linearly combining multiple neural network classifiers based on statistical pattern recognition theory. In the proposed approach, several neural networks are first selected according to which network works best for each class in terms of minimizing classification errors. They are then linearly combined to form an ideal classifier that exploits the strengths of the individual classifiers. The minimum classification error (MCE) criterion is used to estimate the optimal linear combination weights. Because the classification decision rule is incorporated into the cost function in this formulation, the resulting weights are better suited to the classification objective. Experimental results on artificial and real data sets show that the proposed method constructs a combined classifier that outperforms the best single classifier in terms of overall classification error on test data.
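The idea described in the abstract can be illustrated with a small sketch: combine the class-posterior outputs of several pre-trained classifiers with linear weights, and tune the weights by minimizing a smoothed misclassification measure in the spirit of the MCE criterion. This is a hedged illustration, not the paper's exact algorithm: the classifiers are simulated with synthetic posteriors rather than trained networks, the smoothed MCE loss is one common sigmoid-based variant, and the optimizer is a simple finite-difference gradient descent projected onto the weight simplex.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: N samples, K classes, and two "classifiers" whose
# posterior outputs are simulated with different noise levels (in the
# paper these would be trained neural networks).
N, K = 200, 3
y = rng.integers(0, K, N)

def noisy_posteriors(labels, noise):
    """Simulate per-sample class posteriors with a given noise level."""
    p = np.full((len(labels), K), noise / K)
    p[np.arange(len(labels)), labels] += 1.0 - noise
    p += rng.uniform(0.0, noise, p.shape)       # corrupt the scores
    return p / p.sum(axis=1, keepdims=True)

g1 = noisy_posteriors(y, 0.8)   # weaker classifier
g2 = noisy_posteriors(y, 0.6)   # stronger classifier
G = np.stack([g1, g2])          # shape (M, N, K), M = number of classifiers

def combined(w):
    """Linear combination of classifier outputs: sum_m w[m] * G[m]."""
    return np.tensordot(w, G, axes=1)           # shape (N, K)

def loss(w, xi=5.0):
    """Smoothed MCE-style loss: sigmoid of the misclassification measure
    d = (best competing score) - (true-class score), averaged over samples."""
    g = combined(w)
    g_true = g[np.arange(N), y]
    g_comp = g.copy()
    g_comp[np.arange(N), y] = -np.inf           # mask the true class
    d = g_comp.max(axis=1) - g_true
    return np.mean(1.0 / (1.0 + np.exp(-xi * d)))

# Projected finite-difference gradient descent on the weight simplex.
w = np.full(2, 0.5)
for _ in range(200):
    eps = 1e-4
    grad = np.array([(loss(w + eps * e) - loss(w - eps * e)) / (2 * eps)
                     for e in np.eye(2)])
    w = np.clip(w - 0.5 * grad, 1e-6, None)
    w /= w.sum()                                # keep weights on the simplex

err_weak = (g1.argmax(1) != y).mean()
err_strong = (g2.argmax(1) != y).mean()
err_combined = (combined(w).argmax(1) != y).mean()
```

On this toy problem the optimized weights shift toward the stronger classifier, and the combined error should be no worse than the weaker classifier's; whether the combination beats the best single classifier, as reported in the paper's experiments, depends on how complementary the individual classifiers are.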