The extreme learning machine (ELM) is a method for training single-hidden-layer feedforward neural networks (SLFNs) that has been shown to be extremely fast while providing very good generalization performance. ELM works by randomly choosing the weights and biases of the hidden nodes and then analytically obtaining the output weights and biases for an SLFN with a previously fixed number of hidden nodes. In this work, we develop a multi-objective micro-genetic ELM (µG-ELM) that provides both an appropriate number of hidden nodes for the problem at hand and the weights and biases that minimize the mean square error (MSE). The multi-objective algorithm is guided by two criteria: the number of hidden nodes and the MSE. Furthermore, as a novelty, µG-ELM incorporates a regression device to decide whether the number of hidden nodes of the individuals in the population should be increased, decreased, or left unchanged. Across the data sets and competitors considered, the proposed algorithm generally achieves lower errors while also requiring fewer hidden nodes.
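The two-phase training the abstract describes (random hidden layer, then an analytic least-squares solve for the output layer) can be sketched as follows. This is a generic minimal ELM in Python/NumPy, not the paper's µG-ELM; the function names, the tanh activation, and the uniform [-1, 1] initialization are illustrative assumptions.

```python
import numpy as np

def elm_fit(X, y, n_hidden, seed=None):
    """Train a basic ELM for regression.

    Hidden-layer weights W and biases b are drawn at random and never
    trained; only the output weights beta are computed, analytically,
    via the Moore-Penrose pseudoinverse (a least-squares solution).
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Randomly chosen hidden-node parameters (illustrative range)
    W = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = np.tanh(X @ W + b)          # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y    # analytic output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Forward pass of the trained SLFN."""
    return np.tanh(X @ W + b) @ beta
```

Because the only "training" is one pseudoinverse computation, fitting is orders of magnitude faster than gradient-based SLFN training; the cost is that the number of hidden nodes must be fixed in advance, which is exactly the choice µG-ELM automates with its second objective.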