A novel selection evolutionary strategy for constrained optimization

  • Authors:
  • Licheng Jiao; Lin Li; Ronghua Shang; Fang Liu; Rustam Stolkin

  • Affiliations:
  • Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education of China, Xidian University, Xi'an, China (Licheng Jiao, Lin Li, Ronghua Shang, Fang Liu); School of Computer Science, University of Birmingham, Edgbaston, Birmingham B15 2TT, UK (Rustam Stolkin)

  • Venue:
  • Information Sciences: an International Journal
  • Year:
  • 2013

Abstract

The existence of infeasible solutions makes it difficult to handle constrained optimization problems (COPs) in a way that ensures efficient, optimal and constraint-satisfying convergence. Although further optimization from feasible solutions typically leads in a direction that generates further feasible solutions, certain infeasible solutions can also provide useful information about promising directions of improvement for the objective function. How well an algorithm exploits these two kinds of solutions largely determines its performance on COPs. This paper proposes a novel selection evolutionary strategy (NSES) for constrained optimization. A self-adaptive selection method is introduced to exploit both informative infeasible and feasible solutions, from the perspective of combining feasibility with multi-objective problem (MOP) techniques. Since the global optimum of a COP is a feasible non-dominated solution, both non-dominated solutions with low constraint violation and feasible solutions with low objective values are beneficial to the evolutionary process; the selection procedure therefore favours the exploration and exploitation of both kinds of solutions. Several theorems and properties are given to support this assertion. The performance of the method is evaluated on 22 well-known benchmark functions. Experimental results show that the proposed method outperforms state-of-the-art algorithms in the speed of finding feasible solutions and the stability of converging to global optimal solutions. In particular, on problems with zero feasibility ratios and more than one active constraint, the method finds feasible solutions within fewer fitness evaluations (FES) and converges to the optimal solutions more reliably than other popular methods from the literature.
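
The abstract states the selection idea only at a high level: view a COP as a bi-objective problem over objective value and constraint violation, and prefer both non-dominated individuals with low violation and feasible individuals with low objective values. The sketch below illustrates that general idea in Python; the function names (violation, select_survivors) and the simple priority ordering are illustrative assumptions, and the paper's self-adaptive mechanism is not reproduced. It is not the authors' NSES implementation.

```python
def violation(x, ineq_constraints):
    """Total constraint violation: sum of positive parts of g_i(x) <= 0."""
    return sum(max(0.0, g(x)) for g in ineq_constraints)

def dominates(a, b):
    """Pareto dominance (minimization) in the (objective, violation) plane."""
    return all(u <= v for u, v in zip(a, b)) and any(u < v for u, v in zip(a, b))

def select_survivors(population, f, ineq_constraints, mu):
    """Choose mu survivors, favouring (i) non-dominated individuals with low
    violation and (ii) feasible individuals with low objective value, then
    filling remaining slots by (violation, objective) lexicographic order."""
    scored = [(i, f(x), violation(x, ineq_constraints))
              for i, x in enumerate(population)]

    # Non-dominated set in the bi-objective (f, violation) space.
    nondominated = [s for s in scored
                    if not any(dominates(t[1:], s[1:])
                               for t in scored if t[0] != s[0])]
    # Feasible individuals sorted by objective value.
    feasible = sorted((s for s in scored if s[2] == 0.0), key=lambda s: s[1])

    order = (sorted(nondominated, key=lambda s: s[2])       # low-violation non-dominated points
             + feasible                                      # low-objective feasible points
             + sorted(scored, key=lambda s: (s[2], s[1])))   # fallback fill

    chosen, seen = [], set()
    for i, _, _ in order:
        if i not in seen:
            chosen.append(population[i])
            seen.add(i)
        if len(chosen) == mu:
            break
    return chosen

# Toy usage: minimize f(x) = x0^2 + x1^2 subject to g(x) = 1 - x0 - x1 <= 0.
f = lambda x: x[0] ** 2 + x[1] ** 2
g = [lambda x: 1 - x[0] - x[1]]
pop = [[0.0, 0.0], [0.6, 0.6], [1.0, 0.2], [0.5, 0.5]]
survivors = select_survivors(pop, f, g, mu=2)
```

In this toy run, the infeasible point [0.0, 0.0] is still retained when it is non-dominated (lowest objective value), while [0.5, 0.5] is kept as the best feasible point, matching the abstract's claim that both kinds of solutions carry useful information.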