A space search optimization algorithm with accelerated convergence strategies

  • Authors:
  • Wei Huang; Sung-Kwun Oh; Zhaolu Guo; Witold Pedrycz

  • Venue:
  • Applied Soft Computing
  • Year:
  • 2013

Abstract

Evolutionary algorithms (EAs), which have been widely used to solve various scientific and engineering optimization problems, are essentially stochastic search algorithms operating over the entire solution space. However, such a random search mechanism may lead to disadvantages such as long computing times and premature convergence. In this study, we propose a space search optimization algorithm (SSOA) with accelerated convergence strategies to alleviate the drawbacks of a purely random search mechanism. The overall framework of the SSOA involves three main search mechanisms: local space search, global space search, and opposition-based search. The local space search, which aims to form new solutions approaching a local optimum, is realized through the concept of the augmented simplex method, which exhibits strong search abilities within a local region. The global space search is carried out by Cauchy searching, which is based on Cauchy mutation; this operation helps the method avoid being trapped in local optima and thereby alleviates premature convergence. An opposition-based search is exploited to accelerate the convergence of the space search; this operator effectively reduces the substantial computational overhead encountered in EAs. Taken together, these mechanisms allow the SSOA to realize an effective search process. To evaluate its performance, the proposed SSOA is compared with differential evolution (DE), a well-known space concept-based evolutionary algorithm. When tested on benchmark functions, the SSOA exhibits competitive performance versus several DE schemes in terms of accuracy and speed of convergence, especially for high-dimensional continuous optimization problems.
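
The abstract names three search mechanisms but does not spell out their update rules. The sketch below is a minimal Python illustration of how such mechanisms are commonly realized: a simplex-style reflection for the local space search, a heavy-tailed Cauchy perturbation for the global space search, and opposition-based mirroring of candidate solutions. All function names, the reflection coefficient, the Cauchy scale, and the greedy replacement loop are illustrative assumptions, not the authors' exact SSOA formulation.

```python
# Hedged sketch of the three search mechanisms described in the abstract.
# Coefficients and the overall loop structure are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

def local_space_search(simplex, bounds):
    """Simplex-style reflection: push the worst point through the centroid
    of the remaining points (reflection coefficient 1.0 assumed)."""
    centroid = simplex[:-1].mean(axis=0)              # simplex sorted best-to-worst
    candidate = centroid + 1.0 * (centroid - simplex[-1])
    return np.clip(candidate, bounds[0], bounds[1])

def global_space_search(x, bounds, scale=0.1):
    """Cauchy mutation: heavy-tailed perturbation that occasionally takes
    large jumps, which is what helps escape local optima."""
    step = scale * (bounds[1] - bounds[0]) * rng.standard_cauchy(size=x.shape)
    return np.clip(x + step, bounds[0], bounds[1])

def opposition_based_search(x, bounds):
    """Opposition-based learning: probe the point mirrored across the
    centre of the search interval."""
    return bounds[0] + bounds[1] - x

# Tiny usage example on the sphere function, purely for illustration.
def sphere(x):
    return float(np.sum(x**2))

dim = 5
bounds = (np.full(dim, -5.0), np.full(dim, 5.0))
pop = rng.uniform(bounds[0], bounds[1], size=(6, dim))
pop = pop[np.argsort([sphere(p) for p in pop])]       # sort best-to-worst

for _ in range(100):
    candidates = [
        local_space_search(pop, bounds),
        global_space_search(pop[0], bounds),
        opposition_based_search(pop[0], bounds),
    ]
    best_candidate = min(candidates, key=sphere)
    if sphere(best_candidate) < sphere(pop[-1]):      # greedy replacement of the worst
        pop[-1] = best_candidate
        pop = pop[np.argsort([sphere(p) for p in pop])]

print("best value found:", sphere(pop[0]))
```

In this reading, the Cauchy step supplies the occasional long-range move that counters premature convergence, while the opposition step is a cheap extra probe of the mirrored region; both complement the purely local simplex reflection.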