The seeker optimization algorithm (SOA) is a novel population-based heuristic stochastic search algorithm based on simulating the act of human searching. In the SOA, the search direction is determined by a seeker's egotistic, altruistic, and pro-active behaviors, while the step length is given by an uncertainty-reasoning behavior. This paper presents the application of the SOA to tuning the structures and parameters of artificial neural networks (ANNs) as a new evolutionary method of ANN training. Simulation experiments on pattern-classification and function-approximation problems are performed, and the SOA is compared with backpropagation (BP) algorithms and with other evolutionary algorithms (EAs). The results show that the performance of the SOA is better than, or at least equivalent to, that of the other EAs (i.e., DE and two variants of PSO) on all the listed problems. Moreover, ANNs with link switches trained by the SOA provide better or comparable learning capability with far fewer links than networks trained by BP algorithms (i.e., GDX, RP, OSS, and SCG). Hence, the SOA can tune the structure and the weight values simultaneously, and, although it is more computationally intensive, it is a promising candidate for training ANNs.
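To make the abstract's description concrete, the sketch below shows a much-simplified SOA-style optimizer on a toy objective. It is an illustrative assumption, not the published algorithm: the egotistic, altruistic, and pro-active components are modeled as pulls toward a seeker's own best position, the population best, and the previous move, and the fuzzy uncertainty-reasoning step length is replaced by a simple linearly decaying step. The function names (`soa_minimize`, `sphere`) and all parameter defaults are hypothetical.

```python
import math
import random


def sphere(x):
    # Toy benchmark objective: sum of squares, minimum 0 at the origin.
    return sum(v * v for v in x)


def soa_minimize(f, dim, pop_size=20, iters=200, lo=-5.0, hi=5.0, seed=0):
    """Simplified SOA-style search (illustrative sketch, not the paper's method)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    prev = [[0.0] * dim for _ in range(pop_size)]     # previous moves (pro-active)
    pbest = [p[:] for p in pos]                       # personal bests (egotistic)
    pbest_val = [f(p) for p in pos]
    g = min(range(pop_size), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]      # population best (altruistic)

    for t in range(iters):
        # Linearly decaying step length: a crude stand-in for the SOA's
        # fuzzy uncertainty-reasoning rule for step size.
        step = (hi - lo) * 0.1 * (1.0 - t / iters)
        for i in range(pop_size):
            new = []
            for d in range(dim):
                ego = pbest[i][d] - pos[i][d]   # toward own best
                alt = gbest[d] - pos[i][d]      # toward population best
                pro = prev[i][d]                # continue previous move
                direction = ego + alt + pro
                if direction:
                    move = step * rng.random() * math.copysign(1.0, direction)
                else:
                    move = 0.0
                new.append(min(hi, max(lo, pos[i][d] + move)))
            prev[i] = [n - p for n, p in zip(new, pos[i])]
            pos[i] = new
            v = f(new)
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = new[:], v
                if v < gbest_val:
                    gbest, gbest_val = new[:], v
    return gbest, gbest_val
```

In an ANN-training setting, the decision vector would instead encode the network's weights (and, with link switches, binary connection flags), and `f` would be the training error of the decoded network.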