Stopping criteria for ensembles of evolutionary artificial neural networks

  • Authors:
  • Minh Ha Nguyen; Hussein A. Abbass; Robert I. McKay

  • Affiliations:
  • Artificial Life and Adaptive Robotics (A.L.A.R.) Lab, School of Information Technology and Electrical Engineering, Australian Defence Force Academy, University of New South Wales, Canberra, ACT 26 ... (all authors)

  • Venue:
  • Design and application of hybrid intelligent systems
  • Year:
  • 2003

Abstract

The formation of ensembles of artificial neural networks has attracted the attention of researchers in the machine learning and statistical inference domains. It has been shown that combining different neural networks can improve the generalization ability of the learning machine. One challenge is deciding when to stop the training or evolution of the neural networks so as to avoid overfitting. In the literature on ensembles of Evolutionary Artificial Neural Networks (EANNs), researchers often use the surviving population at the last generation to form the ensemble. In this paper, we show that ensembles constructed from the populations selected by two early stopping criteria, (i) the minimum validation fitness of the ensemble and (ii) the minimum of the average population validation fitness, can generalize better than the ensemble formed from the population in the last generation. The proposition was tested on ensembles whose members are diversified by two mechanisms: (i) negative correlation learning (NCL) and (ii) an island model. The experimental results suggest that using the minimum validation fitness of the ensemble as an early stopping criterion performs significantly better (with 99% confidence) than using the population in the last generation on three (with NCL) and four (with the island model) of the five datasets.
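
The two early stopping criteria amount to monitoring, at each generation, the validation fitness of the combined ensemble and the average validation fitness of the individual members, and keeping a snapshot of the population whenever either quantity reaches a new minimum. The following minimal Python sketch illustrates the idea; it is not the authors' implementation, and the callables passed in (the evolutionary step and the two fitness evaluations) are hypothetical stand-ins for problem-specific code, with lower fitness assumed to be better.

```python
# Minimal sketch (assumption, not the paper's code): track the populations
# selected by the two early-stopping criteria during an evolutionary run.

def evolve_with_stopping_snapshots(population, n_generations, validation_set,
                                   evolve_one_generation,        # hypothetical callable
                                   ensemble_validation_fitness,  # hypothetical callable
                                   member_validation_fitness):   # hypothetical callable
    best_ensemble = (float("inf"), None)  # criterion (i): ensemble validation fitness
    best_average = (float("inf"), None)   # criterion (ii): mean member validation fitness

    for _ in range(n_generations):
        population = evolve_one_generation(population)

        # Criterion (i): validation fitness of the combined ensemble output.
        ens_fit = ensemble_validation_fitness(population, validation_set)
        if ens_fit < best_ensemble[0]:
            best_ensemble = (ens_fit, list(population))

        # Criterion (ii): average validation fitness over individual members.
        avg_fit = sum(member_validation_fitness(net, validation_set)
                      for net in population) / len(population)
        if avg_fit < best_average[0]:
            best_average = (avg_fit, list(population))

    # Three candidate ensembles to compare: the final population and the two
    # snapshots captured when each criterion reached its minimum.
    return population, best_ensemble[1], best_average[1]
```

Comparing the test-set performance of the three returned populations corresponds to the comparison reported in the abstract: the last-generation ensemble versus the ensembles chosen by criteria (i) and (ii).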