This paper proposes a learning framework for single-hidden-layer feedforward neural networks (SLFNs) called the optimized extreme learning machine (O-ELM). In O-ELM, the structure and the parameters of the SLFN are determined by an optimization method. The output weights are obtained, as in batch ELM, by a least-squares algorithm, but with Tikhonov regularization to improve the SLFN's performance in the presence of noisy data. The optimization method is applied to the set of input variables, the hidden-layer configuration and biases, the input weights, and the Tikhonov regularization factor. The proposed framework has been tested with three optimization methods (genetic algorithms, simulated annealing, and differential evolution) on 16 benchmark problems from public repositories.
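The core computation the abstract describes — solving for the output weights by least squares with Tikhonov regularization, given random input weights and biases — can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function names, the `tanh` activation, and the parameter `lam` (the regularization factor) are assumptions for the example.

```python
import numpy as np

def elm_output_weights(X, T, W, b, lam=1e-3):
    """Regularized least-squares solution for SLFN output weights.

    X: (n_samples, n_inputs) input data
    T: (n_samples, n_outputs) targets
    W: (n_inputs, n_hidden) input weights (random in ELM)
    b: (n_hidden,) hidden-layer biases
    lam: Tikhonov regularization factor (hypothetical parameter name)
    """
    H = np.tanh(X @ W + b)  # hidden-layer output matrix
    # Tikhonov-regularized normal equations: beta = (H'H + lam*I)^-1 H'T
    beta = np.linalg.solve(H.T @ H + lam * np.eye(H.shape[1]), H.T @ T)
    return beta

def elm_predict(X, W, b, beta):
    """Forward pass of the trained SLFN."""
    return np.tanh(X @ W + b) @ beta
```

In the O-ELM framework, an outer optimizer (a genetic algorithm, simulated annealing, or differential evolution) would then search over the input-variable subset, the hidden-layer configuration, `W`, `b`, and `lam`, re-solving the least-squares problem above for each candidate.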