A method to improve the performance of multilayer perceptron by utilizing various activation functions in the last hidden layer and the least squares method

  • Authors:
  • Krzysztof Halawa

  • Affiliations:
Wroclaw University of Technology, 50-370 Wroclaw, Poland

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2011

Abstract

This article presents a fast and uncomplicated method of modifying multilayer perceptrons that achieves a considerable single-step reduction of the cost function, which in this case is the mean squared error. The method consists mainly in changing the activation functions of the neurons in the last hidden layer and in a single application of the least squares method. The weights of the neurons in the hidden layers are not changed. Essential strengths of the method are that it can be used to improve the operation of previously trained networks and that the learning process need not be restarted from the beginning.
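
The least-squares step summarized in the abstract can be illustrated with a short sketch. The example below is an assumption-laden illustration rather than the paper's implementation: it assumes a single hidden layer with a linear output layer, takes the last-hidden-layer activations (computed with whatever activation functions are in use) as given, and refits only the output-layer weights and bias by solving a linear least-squares problem with NumPy, leaving all hidden-layer weights untouched. The function and variable names are hypothetical.

```python
# Minimal sketch (assumptions: single hidden layer, linear output layer,
# mean-squared-error cost). Only the output weights are refit in one step;
# the hidden-layer weights and activation functions are taken as given.
import numpy as np

def refit_output_weights(hidden_activations, targets):
    """Solve min_W ||[H, 1] W - Y||^2 for the output-layer weights and bias.

    hidden_activations : (n_samples, n_hidden) outputs of the last hidden
                         layer for the training set
    targets            : (n_samples, n_outputs) desired network outputs
    """
    n = hidden_activations.shape[0]
    # Append a column of ones so the bias is estimated jointly with the weights.
    H = np.hstack([hidden_activations, np.ones((n, 1))])
    W, *_ = np.linalg.lstsq(H, targets, rcond=None)
    return W  # shape: (n_hidden + 1, n_outputs)

# Hypothetical usage with random data standing in for the activations of a
# previously trained network's last hidden layer:
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    H = np.tanh(rng.normal(size=(200, 10)))   # stand-in hidden activations
    Y = rng.normal(size=(200, 2))             # stand-in targets
    W = refit_output_weights(H, Y)
    mse = np.mean((np.hstack([H, np.ones((200, 1))]) @ W - Y) ** 2)
    print("MSE after the single least-squares step:", mse)
```

Because the problem is linear in the output-layer weights once the hidden activations are fixed, a single least-squares solve gives the minimum of the mean squared error over those weights, which is consistent with the single-step cost reduction described in the abstract.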