Error minimized extreme learning machine with growth of hidden nodes and incremental learning
IEEE Transactions on Neural Networks
Universal approximation using incremental constructive feedforward networks with random hidden nodes
IEEE Transactions on Neural Networks
ELM-Based time-variant neural networks with incremental number of output basis functions
ISNN'11 Proceedings of the 8th international conference on Advances in neural networks - Volume Part I
On-Line extreme learning machine for training time-varying neural networks
ICIC'11 Proceedings of the 7th international conference on Intelligent Computing: bio-inspired computing and applications
Extreme Learning Machine (ELM) is a learning algorithm for Neural Networks (NN) that is much faster than traditional gradient-based training techniques, and many variants, extensions, and applications have appeared in the recent NN literature. Among them, an ELM approach has been applied to training Time-Variant Neural Networks (TV-NN), with the main objective of reducing the training time. Moreover, interesting approaches have been proposed to automatically determine the number of hidden nodes, which addresses one of the limitations of the original ELM algorithm. In this paper, we extend the Error Minimized Extreme Learning Machine (EMELM) algorithm, along with two other incremental ELM methods, to the time-variant case, which is currently missing from the related literature. Comparative simulation results show that the proposed EMELM-TV efficiently determines an optimal basic network architecture while guaranteeing good generalization performance.
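The core idea behind ELM with growth of hidden nodes can be illustrated with a minimal sketch: hidden-layer input weights and biases are drawn at random and never trained, output weights are obtained in closed form by least squares, and nodes are added one at a time until the training error falls below a threshold. This is an assumed, simplified illustration (the output weights are recomputed with a full pseudo-inverse at every step, rather than with the efficient incremental update of the actual EMELM algorithm, and the function names are hypothetical):

```python
import numpy as np

def elm_train_grow(X, y, max_nodes=60, tol=1e-4, rng=None):
    """Grow random sigmoid hidden nodes until training RMSE < tol.

    Simplified sketch: beta is recomputed from scratch via pinv at each
    step; EMELM instead updates it incrementally as nodes are added.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    W = np.empty((0, X.shape[1]))   # hidden-layer input weights (random, untrained)
    b = np.empty(0)                 # hidden-layer biases (random, untrained)
    beta = np.empty(0)              # output weights (solved by least squares)
    for _ in range(max_nodes):
        # add one new random hidden node
        W = np.vstack([W, rng.standard_normal((1, X.shape[1]))])
        b = np.concatenate([b, rng.standard_normal(1)])
        # hidden-layer output matrix H (sigmoid activation)
        H = 1.0 / (1.0 + np.exp(-(X @ W.T + b)))
        # closed-form least-squares solution for the output weights
        beta = np.linalg.pinv(H) @ y
        rmse = np.sqrt(np.mean((H @ beta - y) ** 2))
        if rmse < tol:
            break
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W.T + b)))
    return H @ beta
```

For example, fitting a one-dimensional sine with `elm_train_grow` typically reaches near-zero training error once the number of hidden nodes approaches the number of samples; the time-variant extension discussed in the paper additionally makes the network coefficients functions of time, which this fixed-weight sketch does not capture.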