Reduced Pattern Training Based on Task Decomposition Using Pattern Distributor

  • Authors:
  • Sheng-Uei Guan; Chunyu Bao; TseNgee Neo

  • Affiliations:
  • Brunel Univ., Uxbridge; -; -

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2007

Abstract

Task decomposition with a pattern distributor (PD) is a new task decomposition method for multilayered feedforward neural networks (NNs). We propose a pattern distributor network that implements this new task decomposition method, together with a theoretical model for analyzing its performance. A method named reduced pattern training (RPT) is also introduced, aiming to improve the performance of the pattern distributor network. Our analysis and experimental results show that RPT improves the performance of the pattern distributor network significantly, and that the distributor module's classification accuracy dominates the performance of the whole network. Two combination methods, namely, crosstalk-based combination and genetic-algorithm (GA)-based combination, are presented to find a suitable class grouping for the distributor module. Experimental results show that this new method can reduce training time and improve network generalization accuracy when compared with a conventional method such as constructive backpropagation or a task decomposition method such as output parallelism (OP).
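
The abstract only outlines the approach, so the following is a minimal illustrative sketch, not the authors' implementation: a distributor module is trained to predict a coarse class group for each pattern, and each sub-module is trained only on the patterns of its own group (the reduced-pattern-training idea). It assumes scikit-learn's MLPClassifier as a stand-in for the paper's constructive-backpropagation sub-networks, the function names are hypothetical, and the `groups` argument is hand-supplied here, whereas the paper would select the grouping via crosstalk-based or GA-based combination.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_pd_network(X, y, groups):
    """Train a distributor module plus one sub-module per class group."""
    # Map each original class label to the index of the group that contains it.
    group_of = {cls: g for g, classes in enumerate(groups) for cls in classes}
    y_group = np.array([group_of[c] for c in y])

    # Distributor module: trained on the full pattern set, but only to
    # predict the coarse group label of each pattern.
    distributor = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
    distributor.fit(X, y_group)

    # Reduced pattern training: each sub-module sees only the patterns
    # belonging to its own group, not the whole training set.
    # (Assumes every group contains at least two classes.)
    sub_modules = []
    for g in range(len(groups)):
        mask = y_group == g
        sub = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
        sub.fit(X[mask], y[mask])
        sub_modules.append(sub)
    return distributor, sub_modules

def predict_pd_network(distributor, sub_modules, X):
    """Route each pattern through the distributor, then through the selected
    sub-module, which produces the final class label."""
    group_pred = distributor.predict(X)
    return np.array([
        sub_modules[g].predict(x.reshape(1, -1))[0]
        for x, g in zip(X, group_pred)
    ])

# Example usage on a hypothetical 4-class problem, with a hand-picked grouping
# chosen purely for illustration:
#   dist, subs = train_pd_network(X_train, y_train, groups=[[0, 1], [2, 3]])
#   y_pred = predict_pd_network(dist, subs, X_test)
```

This sketch also reflects why the abstract notes that the distributor module's accuracy dominates overall performance: if the distributor routes a pattern to the wrong group, no sub-module can recover the correct class.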