Training multi-layer perceptrons using minimin approach

  • Authors:
  • Liefeng Bo;Ling Wang;Licheng Jiao

  • Affiliations:
Institute of Intelligent Information Processing, Xidian University, Xi'an, China (all authors)

  • Venue:
CIS'05: Proceedings of the 2005 International Conference on Computational Intelligence and Security - Volume Part I
  • Year:
  • 2005


Abstract

Multi-layer perceptrons (MLPs) have been widely used in classification and regression tasks. How to improve the training speed of MLPs has been an active field of research. Instead of the classical approach, we train MLPs with a MiniMin model, which ensures that the weights of the last layer are optimal at each step. Significant improvements in training speed are achieved with our method on several large benchmark data sets.
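The abstract's key idea, keeping the last-layer weights optimal at every step, is commonly realized for a squared-error loss by solving the output layer in closed form (linear least squares) while the hidden-layer weights are updated by gradient descent. The sketch below illustrates that idea only; it is not the authors' exact MiniMin algorithm, and the function name, hyperparameters, and one-hidden-layer setup are assumptions made for illustration.

```python
import numpy as np

# Minimal sketch (assumed interpretation, not the paper's exact method):
# train a one-hidden-layer MLP for regression with squared-error loss.
# At every iteration the output-layer weights are solved exactly by
# linear least squares, so they are optimal given the current hidden
# representation; the hidden-layer weights are then updated by ordinary
# gradient descent.
def train_mlp_minimin_sketch(X, y, n_hidden=20, lr=1e-2, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(scale=0.1, size=(d, n_hidden))   # hidden-layer weights
    b1 = np.zeros(n_hidden)                           # hidden-layer biases

    for _ in range(n_iter):
        # Forward pass through the hidden layer (tanh activation).
        H = np.tanh(X @ W1 + b1)                      # shape (n, n_hidden)

        # Output-layer weights: closed-form least-squares solution,
        # i.e. optimal for the current hidden features.
        Hb = np.hstack([H, np.ones((n, 1))])          # append bias column
        w2, *_ = np.linalg.lstsq(Hb, y, rcond=None)

        # Gradient of the squared error w.r.t. the hidden-layer weights,
        # holding the (optimal) output weights fixed.
        err = Hb @ w2 - y                             # shape (n,)
        dH = np.outer(err, w2[:-1])                   # shape (n, n_hidden)
        dZ = dH * (1.0 - H ** 2)                      # tanh derivative
        W1 -= lr * (X.T @ dZ) / n
        b1 -= lr * dZ.mean(axis=0)

    return W1, b1, w2


# Hypothetical usage on synthetic data, for illustration only.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 3))
    y = np.sin(X[:, 0]) + 0.1 * X[:, 1]
    W1, b1, w2 = train_mlp_minimin_sketch(X, y)
```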