Modified Gram-Schmidt Algorithm for Extreme Learning Machine

  • Authors:
  • Jianchuan Yin, Fang Dong, Nini Wang


  • Venue:
  • ISCID '09 Proceedings of the 2009 Second International Symposium on Computational Intelligence and Design - Volume 02
  • Year:
  • 2009

Abstract

Extreme learning machine (ELM) has been shown to be extremely fast while achieving good generalization performance. The basic idea of the ELM algorithm is to choose the parameters of the hidden nodes randomly and then solve for the output weights of the network with a simple generalized inverse operation. This procedure faces two problems. First, ELM tends to require more random hidden nodes than conventional tuning-based algorithms. Second, choosing an appropriate number of random hidden nodes involves subjectivity. In this paper, we propose an enhanced ELM (en-ELM) algorithm that applies the modified Gram-Schmidt (MGS) method to select hidden nodes from a pool of random hidden nodes. Furthermore, en-ELM uses Akaike's final prediction error (FPE) criterion to determine the number of random hidden nodes automatically. Compared with the conventional ELM learning method on several commonly used regression benchmark problems, the en-ELM algorithm achieves a compact network with much faster response and satisfactory accuracy.
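
To make the procedure concrete, the sketch below illustrates one way the idea described in the abstract could be realized; it is an assumption-laden illustration, not the authors' implementation. A pool of random sigmoid hidden nodes is generated as in standard ELM, a modified Gram-Schmidt forward-selection loop greedily picks the node whose orthogonalized output best reduces the current residual, Akaike's FPE criterion decides when to stop adding nodes, and the output weights of the selected nodes are solved with the pseudoinverse. The pool size, the sigmoid activation, the greedy selection order, and all function names are assumptions.

```python
# Minimal sketch of an en-ELM-style procedure (assumed details, not the paper's code).
import numpy as np

def random_hidden_pool(X, pool_size, rng):
    """Outputs of a pool of random sigmoid hidden nodes, shape (N, pool_size)."""
    n_features = X.shape[1]
    W = rng.uniform(-1.0, 1.0, size=(n_features, pool_size))  # random input weights
    b = rng.uniform(-1.0, 1.0, size=pool_size)                 # random biases
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def fpe(residual_sse, n_samples, n_params):
    """Akaike's final prediction error for a linear-in-parameters model."""
    mse = residual_sse / n_samples
    return mse * (n_samples + n_params) / (n_samples - n_params)

def en_elm(X, y, pool_size=100, seed=0):
    """Greedy MGS-based selection of random hidden nodes, stopped by FPE."""
    rng = np.random.default_rng(seed)
    H = random_hidden_pool(X, pool_size, rng)
    N = X.shape[0]
    selected = []                                   # indices of chosen nodes
    r = y.astype(float).copy()                      # residual of the target
    cols = {j: H[:, j].astype(float).copy() for j in range(pool_size)}
    best_fpe = np.inf

    for step in range(1, pool_size + 1):
        # Pick the remaining (already orthogonalized) column most correlated with the residual.
        best_j, best_gain = None, 0.0
        for j, v in cols.items():
            nv2 = float(v @ v)
            if nv2 < 1e-12:
                continue
            gain = (v @ r) ** 2 / nv2               # SSE reduction if this column is added
            if gain > best_gain:
                best_j, best_gain = j, gain
        if best_j is None:
            break

        q = cols.pop(best_j)
        q /= np.linalg.norm(q)
        # Modified Gram-Schmidt step: remove the new direction from all remaining columns.
        for j in cols:
            cols[j] -= (q @ cols[j]) * q
        r = r - (q @ r) * q                         # update the target residual
        selected.append(best_j)

        current_fpe = fpe(float(r @ r), N, step)
        if current_fpe >= best_fpe:                 # FPE stopped improving: undo last pick
            selected.pop()
            break
        best_fpe = current_fpe

    # Output weights of the selected nodes via the pseudoinverse, as in standard ELM.
    H_sel = H[:, selected]
    beta = np.linalg.pinv(H_sel) @ y
    return selected, beta
```

The greedy orthogonal selection here is essentially orthogonal-least-squares forward selection built on modified Gram-Schmidt, and the FPE criterion trades the remaining fit error against the number of selected nodes, which is how such a sketch avoids fixing the hidden-node count by hand.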