Multi-objective optimization of a stacked neural network using an evolutionary hyper-heuristic

  • Authors:
  • Renata Furtuna, Silvia Curteanu, Florin Leon

  • Affiliations:
  • "Gh. Asachi" Technical University of Iasi, Department of Chemical Engineering, B-dul D. Mangeron, No. 71A, 700050, Iasi, Romania (Renata Furtuna, Silvia Curteanu); "Gh. Asachi" Technical University of Iasi, Department of Computer Science and Engineering, B-dul D. Mangeron, No. 53A, 700050, Iasi, Romania (Florin Leon)

  • Venue:
  • Applied Soft Computing
  • Year:
  • 2012

Abstract

The present paper deals with the development and optimization of a stacked neural network (SNN) through an evolutionary hyper-heuristic called NSGA-II-QNSNN. The proposed hyper-heuristic is based on the NSGA-II (Non-dominated Sorting Genetic Algorithm II) multi-objective evolutionary optimization algorithm, which incorporates the Quasi-Newton (QN) optimization algorithm. QN is used to train each neural network in the stack. The final global solution provided by the NSGA-II-QNSNN algorithm is a Pareto optimal front: the set of all equally good compromises between the structural complexity of the stacked neural network and its modelling performance. The decision variables that produce the points of the Pareto optimal front are the optimal values of the stacked neural network's parameters: the number of networks in the stack, the weight applied to the output of each composing network, and the number of hidden neurons in each individual network. Each stacked neural network determined through the optimization process was trained and tested on a real-world problem: modelling the synthesis of polyacrylamide-based multicomponent hydrogels. The neural models established the influence of the reaction conditions on the reaction yield and the swelling degree. The results provided by NSGA-II-QNSNN were superior, in both performance and structural complexity, to those obtained in our previous works, where individual or aggregated neural networks were used but the stacks were developed manually through successive trials.
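The two core ideas in the abstract, aggregating the stack's member networks by weighted combination and keeping only Pareto non-dominated trade-offs between complexity and error, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names are invented, the member networks are stand-in callables, and the objective vectors (complexity, error) are toy values.

```python
import numpy as np

def stack_predict(networks, weights, x):
    """Weighted aggregation of the member networks' outputs:
    the stack's prediction is sum_i w_i * f_i(x).
    'networks' are assumed to be callables x -> prediction."""
    return sum(w * net(x) for w, net in zip(weights, networks))

def dominates(a, b):
    """Pareto dominance for minimization: a dominates b if it is
    no worse in every objective and strictly better in at least one.
    Here an objective vector would be (structural complexity, error)."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a <= b) and np.any(a < b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors,
    i.e. the candidates that survive onto the Pareto optimal front."""
    return [p for i, p in enumerate(points)
            if not any(dominates(q, p)
                       for j, q in enumerate(points) if j != i)]

# Toy example: (complexity, error) pairs; (2, 3) is dominated by (1, 3).
front = pareto_front([(1, 3), (2, 2), (3, 1), (2, 3)])
```

In the full algorithm, NSGA-II would evolve the decision variables (stack size, output weights, hidden-neuron counts) while QN trains each member network; the sketch above only shows the aggregation and the dominance test that defines the reported front.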