Error minimized extreme learning machine with growth of hidden nodes and incremental learning

  • Authors:
  • Guorui Feng;Guang-Bin Huang;Qingping Lin;Robert Gay

  • Affiliations:
  • School of Communication and Information Engineering, Shanghai University, Shanghai, China;School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore, Singapore;School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore, Singapore;School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore, Singapore

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2009

Abstract

One of the open problems in neural network research is how to automatically determine network architectures for given applications. In this brief, we propose a simple and efficient approach to automatically determine the number of hidden nodes in generalized single-hidden-layer feedforward networks (SLFNs), whose hidden nodes need not be neuron-like. This approach, referred to as the error minimized extreme learning machine (EM-ELM), can add random hidden nodes to SLFNs one by one or group by group (with varying group size). As the network grows, the output weights are updated incrementally. The convergence of this approach is also proved in this brief. Simulation results verify that the new approach is much faster than other sequential/incremental/growing algorithms while achieving good generalization performance.
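The growth strategy described in the abstract can be illustrated with a minimal NumPy sketch. Note this is a simplified illustration, not the paper's algorithm: here the output weights are recomputed by batch least squares after each growth step, whereas EM-ELM updates them incrementally via an efficient update of the hidden-layer pseudoinverse. The function name `grow_elm`, the sigmoid activation, and all parameter names are illustrative choices, not from the paper.

```python
import numpy as np

def grow_elm(X, T, max_nodes=60, step=5, target_rmse=0.05, seed=0):
    """Grow an SLFN by adding `step` random hidden nodes at a time
    until the training RMSE drops below `target_rmse` or `max_nodes`
    is reached. Simplified sketch: output weights are re-solved by
    least squares at each step, not updated incrementally as in EM-ELM."""
    rng = np.random.default_rng(seed)
    n_samples, n_in = X.shape
    H = np.empty((n_samples, 0))  # hidden-layer output matrix
    rmse = np.inf
    while H.shape[1] < max_nodes and rmse > target_rmse:
        # add a group of random hidden nodes (sigmoid activation)
        W_new = rng.standard_normal((n_in, step))
        b_new = rng.standard_normal(step)
        H_new = 1.0 / (1.0 + np.exp(-(X @ W_new + b_new)))
        H = np.hstack([H, H_new])
        # solve for output weights (EM-ELM would update these incrementally)
        beta, *_ = np.linalg.lstsq(H, T, rcond=None)
        rmse = np.sqrt(np.mean((H @ beta - T) ** 2))
    return H.shape[1], rmse
```

For example, fitting `T = sin(3x)` on 200 points typically stops well before the node budget is exhausted, since random sigmoid features span smooth targets easily.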