The Multi-Layer Perceptron (MLP), one of the most widely used Neural Networks (NNs), has been applied to many practical problems. An MLP must be trained for each specific application, and training often suffers from entrapment in local minima, slow convergence, and sensitivity to weight initialization. This paper proposes using the recently developed Biogeography-Based Optimization (BBO) algorithm to train MLPs and so reduce these problems. To investigate the efficiency of BBO in training MLPs, five classification datasets and six function-approximation datasets are employed. The results are compared with those of five well-known heuristic algorithms, Back-Propagation (BP), and the Extreme Learning Machine (ELM) in terms of entrapment in local minima, accuracy, and convergence rate. The results show that training MLPs with BBO is significantly better than with the current heuristic learning algorithms and BP. Moreover, BBO provides very competitive results compared with ELM.
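The core idea can be sketched as follows: each candidate weight vector of an MLP is treated as a BBO "habitat", fitness is the network's training error, and habitats exchange individual weights through immigration/emigration (migration) plus random mutation. This is a minimal illustrative sketch, not the paper's implementation; the 2-4-1 architecture, the XOR toy data, and all parameter values (population size, rates, bounds) are assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset (XOR) standing in for the paper's benchmark datasets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

N_HIDDEN = 4
N_W = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1  # 2-4-1 MLP: weights + biases

def mlp_mse(w):
    """MSE of a 2-4-1 sigmoid MLP whose parameters are unpacked from vector w."""
    i = 0
    W1 = w[i:i + 2 * N_HIDDEN].reshape(2, N_HIDDEN); i += 2 * N_HIDDEN
    b1 = w[i:i + N_HIDDEN]; i += N_HIDDEN
    W2 = w[i:i + N_HIDDEN]; i += N_HIDDEN
    b2 = w[i]
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return np.mean((out - y) ** 2)

def bbo_train(pop_size=30, iters=200, p_mut=0.05, lo=-2.0, hi=2.0):
    """Train the MLP with a basic BBO loop: sort, migrate, mutate, keep elite."""
    pop = rng.uniform(lo, hi, (pop_size, N_W))
    for _ in range(iters):
        cost = np.array([mlp_mse(h) for h in pop])
        order = np.argsort(cost)            # rank habitats: best first
        pop = pop[order]
        mu = np.linspace(1.0, 0.0, pop_size)   # emigration: high for good habitats
        lam = 1.0 - mu                         # immigration: high for poor habitats
        new_pop = pop.copy()
        for k in range(pop_size):
            for d in range(N_W):
                if rng.random() < lam[k]:
                    # Roulette-wheel pick of an emigrating habitat, weighted by mu.
                    src = rng.choice(pop_size, p=mu / mu.sum())
                    new_pop[k, d] = pop[src, d]
                if rng.random() < p_mut:
                    new_pop[k, d] = rng.uniform(lo, hi)
        new_pop[0] = pop[0]                 # elitism: preserve the best habitat
        pop = new_pop
    cost = np.array([mlp_mse(h) for h in pop])
    return pop[np.argmin(cost)], cost.min()

best_w, best_mse = bbo_train()
print(f"best training MSE: {best_mse:.4f}")
```

In this sketch BBO plays the role gradient descent plays in BP: because it only evaluates the error, never differentiates it, it is less prone to stalling in a single local minimum, which is the property the comparison in the abstract examines.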