A study on the randomness reduction effect of extreme learning machine with ridge regression

  • Authors:
  • Meng Joo Er, Zhifei Shao, Ning Wang

  • Affiliations:
  • Meng Joo Er: Dalian Maritime University, Dalian, China and Nanyang Technological University, Singapore; Zhifei Shao: Nanyang Technological University, Singapore; Ning Wang: Dalian Maritime University, Dalian, China

  • Venue:
  • ISNN'13: Proceedings of the 10th International Conference on Advances in Neural Networks, Part I
  • Year:
  • 2013

Abstract

In recent years, the Extreme Learning Machine (ELM) has attracted considerable attention as a universal function approximator. Compared with other single-hidden-layer feedforward neural networks, the input parameters of its hidden neurons can be randomly generated rather than tuned, saving a substantial amount of computation. However, it has been pointed out that the randomness of ELM parameters results in fluctuating performance. In this paper, we investigate in depth the randomness reduction effect of a regularized version of ELM, named Ridge ELM (RELM). RELM has previously been shown to achieve generally better generalization than the original ELM. Here, we demonstrate on 12 real-world regression tasks that RELM also greatly reduces this performance fluctuation, and we offer an insight into why this randomness reduction occurs.
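To make the setup described in the abstract concrete, the following is a minimal NumPy sketch (not the authors' code) of an ELM with a ridge-regularized output layer. The sigmoid activation, the number of hidden neurons, and the regularization strength `lam` are illustrative assumptions; only the output weights are learned, while the hidden-layer weights and biases stay random.

```python
import numpy as np

def relm_fit(X, y, n_hidden=50, lam=1e-3, seed=None):
    """Train a Ridge ELM (illustrative sketch).

    The hidden-layer parameters W and b are drawn at random and never
    tuned; only the output weights beta are solved for, with a ridge
    penalty lam that damps the effect of an unlucky random draw.
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # sigmoid hidden outputs
    # Ridge solution: beta = (H^T H + lam * I)^{-1} H^T y
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def relm_predict(X, W, b, beta):
    """Predict with a fitted Ridge ELM."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

Setting `lam = 0` recovers the original least-squares ELM; a positive `lam` shrinks the output weights, which is the mechanism the paper associates with reduced run-to-run variance.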