In recent years, the Extreme Learning Machine (ELM) has attracted considerable attention as a universal function approximator. Compared to other single-hidden-layer feedforward neural networks, the input parameters of its hidden neurons can be randomly generated rather than tuned, saving a substantial amount of computation. However, it has been pointed out that the randomness of ELM parameters results in fluctuating performance across random initializations. In this paper, we investigate in depth the randomness-reduction effect of a regularized version of ELM, named Ridge ELM (RELM). RELM has previously been shown to achieve generally better generalization than the original ELM. Here, we further demonstrate on 12 real-world regression tasks that RELM also greatly reduces this performance fluctuation, and we offer an insight into why this randomness-reduction effect occurs.
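The idea described above can be illustrated with a minimal sketch (not the paper's implementation): an ELM draws the hidden-layer weights and biases at random and only solves for the output weights by least squares; RELM replaces the plain least-squares solve with a ridge-regularized one, beta = (HᵀH + λI)⁻¹Hᵀy. The toy data, the tanh activation, the hidden-layer size, and the ridge parameter λ = 0.01 below are all illustrative assumptions.

```python
import numpy as np

# Toy regression task (assumption): noisy sine curve, 150 train / 50 test points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
Xtr, ytr, Xte, yte = X[:150], y[:150], X[150:], y[150:]

def elm_fit(X, y, n_hidden=50, ridge=0.0, seed=0):
    """Fit an ELM: random hidden layer, least-squares output weights.

    ridge=0 gives the plain ELM (minimum-norm least squares);
    ridge>0 gives Ridge ELM with beta = (H^T H + ridge*I)^{-1} H^T y.
    """
    r = np.random.default_rng(seed)
    W = r.normal(size=(X.shape[1], n_hidden))  # random input weights (not tuned)
    b = r.normal(size=n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                     # hidden-layer output matrix
    if ridge > 0.0:
        beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ y)
    else:
        beta = np.linalg.lstsq(H, y, rcond=None)[0]
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Measure how much the test error fluctuates across random hidden layers.
errs_plain, errs_ridge = [], []
for seed in range(20):
    m0 = elm_fit(Xtr, ytr, ridge=0.0, seed=seed)
    m1 = elm_fit(Xtr, ytr, ridge=1e-2, seed=seed)
    errs_plain.append(np.mean((elm_predict(m0, Xte) - yte) ** 2))
    errs_ridge.append(np.mean((elm_predict(m1, Xte) - yte) ** 2))

print("plain ELM test-MSE std over seeds:", np.std(errs_plain))
print("ridge ELM test-MSE std over seeds:", np.std(errs_ridge))
```

On this toy task the standard deviation of the test error across seeds is the quantity of interest: the paper's claim is that the ridge term damps the sensitivity of the output weights to the random hidden layer, so the RELM errors should scatter less than the plain-ELM errors.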