Can Evolution Strategies Improve Learning Guidance in XCS? Design and Comparison with Genetic Algorithms based XCS

  • Authors:
  • Sergio Morales-Ortigosa, Albert Orriols-Puig, Ester Bernadó-Mansilla

  • Affiliations:
  • Grup de Recerca en Sistemes Intel·ligents, Enginyeria i Arquitectura La Salle, Universitat Ramon Llull, Quatre Camins 2, 08022, Barcelona (Spain)

  • Venue:
  • Artificial Intelligence Research and Development: Proceedings of the 11th International Conference of the Catalan Association for Artificial Intelligence
  • Year:
  • 2008

Abstract

XCS is a complex machine learning technique that combines credit apportionment methods for rule evaluation with genetic algorithms for rule discovery to evolve a distributed set of sub-solutions online. Recent research on XCS has mainly focused on achieving a better understanding of the reinforcement component, yielding several improvements to the architecture. Nonetheless, studies on the rule discovery component of the system are scarce. In this paper, we experimentally study the discovery component of XCS, which is guided by a steady-state genetic algorithm. We design a new procedure based on evolution strategies and adapt it to the system. Then, we compare XCS with both genetic algorithms and evolution strategies on a large collection of real-life problems, analyzing in detail the interaction of the different genetic operators and their contribution to the search for better rules. The overall analysis shows the competitiveness of the new XCS based on evolution strategies and increases our understanding of the behavior of the different genetic operators in XCS.
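
The abstract describes replacing the steady-state genetic algorithm in XCS's discovery component with a procedure derived from evolution strategies. The snippet below is not the paper's operator; it is a minimal sketch, in Python, of what an ES-style mutation of a classifier condition could look like, assuming interval-based conditions over real-valued attributes normalized to [0, 1] and a self-adaptive Gaussian step size per classifier. The names es_mutate, TAU, and MIN_SIGMA are hypothetical.

```python
import math
import random

# Minimal sketch (not the paper's implementation): an evolution-strategy-style
# mutation for a classifier whose condition is a list of (lower, upper)
# intervals over real-valued attributes in [0, 1].

TAU = 0.5          # learning rate for the log-normal step-size adaptation (assumed value)
MIN_SIGMA = 1e-3   # lower bound that keeps the step size from collapsing to zero

def es_mutate(condition, sigma):
    """Return a mutated copy of the condition and its adapted step size."""
    # Self-adapt the step size first (log-normal rule), then use it to perturb
    # every interval bound with Gaussian noise.
    new_sigma = max(MIN_SIGMA, sigma * math.exp(TAU * random.gauss(0.0, 1.0)))
    child = []
    for lower, upper in condition:
        lo = min(max(lower + random.gauss(0.0, new_sigma), 0.0), 1.0)
        hi = min(max(upper + random.gauss(0.0, new_sigma), 0.0), 1.0)
        if lo > hi:                      # keep the interval well-formed
            lo, hi = hi, lo
        child.append((lo, hi))
    return child, new_sigma

# Example: mutate a two-attribute condition with an initial step size of 0.1.
parent = [(0.2, 0.6), (0.4, 0.9)]
offspring, sigma = es_mutate(parent, 0.1)
print(offspring, sigma)
```

Self-adaptation is the usual motivation for an ES-based operator: each classifier carries its own mutation strength, which is inherited and adjusted along with the rule, so the search can widen or narrow per niche instead of relying on a single global mutation rate.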