Genetic Algorithms in Search, Optimization and Machine Learning
A Cooperative Coevolutionary Approach to Function Optimization
PPSN III: Proceedings of the Third International Conference on Parallel Problem Solving from Nature
Cooperative Coevolution: An Architecture for Evolving Coadapted Subcomponents
Evolutionary Computation
Large scale evolutionary optimization using cooperative coevolution
Information Sciences: An International Journal
A cooperative coevolutionary algorithm with correlation based adaptive variable partitioning
CEC'09: Proceedings of the 2009 IEEE Congress on Evolutionary Computation
Metaheuristics: From Design to Implementation
Memetic algorithms for continuous optimisation based on local search chains
Evolutionary Computation
Large-scale global optimization using cooperative coevolution with variable interaction learning
PPSN'10: Proceedings of the 11th International Conference on Parallel Problem Solving from Nature, Part II
Soft Computing - A Fusion of Foundations, Methodologies and Applications: Special Issue on Scalability of Evolutionary Algorithms and Other Metaheuristics for Large-Scale Continuous Optimization Problems
Fast Distributed Algorithms for Computing Separable Functions
IEEE Transactions on Information Theory
Decomposing a large-scale problem into smaller subproblems is one approach to overcoming the performance deterioration that evolutionary algorithms (EAs) typically suffer as dimensionality grows. For a decomposition approach to perform well, interdependent variables must be grouped into the same subproblem. In this paper, the Hybrid Dependency Identification with Memetic Algorithm (HDIMA) model is proposed for large-scale optimization problems. The Dependency Identification (DI) technique identifies the variables that must be grouped together to form the subproblems, which are then evolved using a Memetic Algorithm (MA). Before the end of the evolution process, the subproblems are aggregated and optimized as one complete large-scale problem. A newly designed test suite is used to evaluate the performance of HDIMA over different dimensions. The evaluation shows that HDIMA is competitive with other models in the literature, consuming fewer computational resources while achieving better performance.
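The decompose-evolve-aggregate pipeline described in the abstract can be illustrated with a minimal sketch. This is not the authors' DI or MA implementation: the variable groups are assumed to be given (standing in for the Dependency Identification output), and a simple random-perturbation local search stands in for the Memetic Algorithm; `sphere`, `optimize_subproblem`, and `cooperative_optimize` are hypothetical names introduced here for illustration.

```python
import random

def sphere(x):
    """Separable benchmark objective: sum of squares (toy stand-in)."""
    return sum(v * v for v in x)

def optimize_subproblem(f, context, group, iters=200, step=0.1):
    """Improve only the variables in `group`, holding the rest of
    `context` fixed (a crude stand-in for evolving one subproblem)."""
    best = list(context)
    for _ in range(iters):
        cand = list(best)
        for i in group:
            cand[i] = best[i] + random.uniform(-step, step)
        if f(cand) < f(best):
            best = cand
    return best

def cooperative_optimize(f, dim, groups, cycles=5):
    """Round-robin over subproblems, then a final pass over all
    variables together, mirroring the aggregation stage."""
    random.seed(0)
    x = [random.uniform(-5, 5) for _ in range(dim)]
    for _ in range(cycles):
        for g in groups:
            x = optimize_subproblem(f, x, g)
    # Aggregate: optimize the complete problem as one whole.
    x = optimize_subproblem(f, x, list(range(dim)))
    return x, f(x)

if __name__ == "__main__":
    # Two assumed groups of interdependent variables.
    x, fx = cooperative_optimize(sphere, dim=8,
                                 groups=[[0, 1, 2, 3], [4, 5, 6, 7]])
    print(fx)
```

Because `sphere` is fully separable, any grouping works here; the point of DI in HDIMA is precisely that on non-separable problems the grouping must match the true variable dependencies for the subproblem optimization to be effective.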