Searching for diverse, cooperative populations with genetic algorithms

  • Authors:
  • Robert E. Smith
  • Stephanie Forrest
  • Alan S. Perelson

  • Affiliations:
  • Dept. of Engineering Mechanics, University of Alabama, Tuscaloosa, AL 35487 (rob@comec4.mh.ua.edu)
  • Dept. of Computer Science, University of New Mexico, Albuquerque, NM 87131 (forrest@cs.unm.edu)
  • Theoretical Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (asp@receptor.lanl.gov)

  • Venue:
  • Evolutionary Computation
  • Year:
  • 1993


Abstract

In typical applications, genetic algorithms (GAs) process populations of potential problem solutions to evolve a single population member that specifies an 'optimized' solution. The majority of GA analysis has focused on these optimization applications. In other applications (notably learning classifier systems and certain connectionist learning systems), a GA searches for a population of cooperative structures that jointly perform a computational task. This paper presents an analysis of this type of GA problem. The analysis considers a simplified genetics-based machine learning system: a model of an immune system. In this model, a GA must discover a set of pattern-matching antibodies that effectively match a set of antigen patterns. Analysis shows how a GA can automatically evolve and sustain a diverse, cooperative population. The cooperation emerges as a natural part of the antigen-antibody matching procedure. This emergent effect is shown to be similar to fitness sharing, an explicit technique for multimodal GA optimization. Further analysis shows how the GA population can adapt to express various degrees of generalization. The results show how GAs can automatically and simultaneously discover effective groups of cooperative computational structures.
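To make the mechanism described in the abstract concrete, the following is a minimal sketch, in Python, of a GA over bit-string antibodies that must collectively cover a set of bit-string antigens. The specific parameter values, the bit-counting match score, and the sample-and-credit-the-best-matcher fitness procedure are illustrative assumptions in the spirit of the model, not the paper's exact formulation; the key point is that crediting only the best matcher for each sampled antigen induces the emergent, fitness-sharing-like pressure toward a diverse, cooperative population.

```python
import random

STRING_LEN = 32
POP_SIZE = 50
SAMPLE_SIZE = 10       # antibodies drawn per matching event (assumed value)
MATCH_EVENTS = 200     # matching events per generation (assumed value)
MUTATION_RATE = 0.01

def random_bits(n):
    return [random.randint(0, 1) for _ in range(n)]

def match_score(antibody, antigen):
    # Simple illustrative match rule: count positions where the bits agree.
    return sum(a == b for a, b in zip(antibody, antigen))

def evaluate(antibodies, antigens):
    # Emergent sharing: each sampled antigen's credit goes only to the
    # best-matching antibody in a random sample, so antibodies that cover
    # different antigens do not compete for the same credit.
    fitness = [0.0] * len(antibodies)
    for _ in range(MATCH_EVENTS):
        antigen = random.choice(antigens)
        sample = random.sample(range(len(antibodies)), SAMPLE_SIZE)
        best = max(sample, key=lambda i: match_score(antibodies[i], antigen))
        fitness[best] += match_score(antibodies[best], antigen)
    return fitness

def next_generation(antibodies, fitness):
    # Fitness-proportional selection, one-point crossover, bit-flip mutation.
    def select():
        idx = random.choices(range(len(antibodies)),
                             weights=[f + 1e-6 for f in fitness])[0]
        return antibodies[idx]
    new_pop = []
    while len(new_pop) < len(antibodies):
        p1, p2 = select(), select()
        cut = random.randint(1, STRING_LEN - 1)
        child = p1[:cut] + p2[cut:]
        child = [b ^ 1 if random.random() < MUTATION_RATE else b for b in child]
        new_pop.append(child)
    return new_pop

if __name__ == "__main__":
    antigens = [random_bits(STRING_LEN) for _ in range(4)]  # patterns to cover
    antibodies = [random_bits(STRING_LEN) for _ in range(POP_SIZE)]
    for gen in range(100):
        antibodies = next_generation(antibodies, evaluate(antibodies, antigens))
    print("final mean fitness:", sum(evaluate(antibodies, antigens)) / POP_SIZE)
```

Because no single antibody can monopolize credit for antigens it does not match well, the population tends to split into subpopulations specializing on different antigens rather than converging on one optimum, which is the diverse, cooperative behavior the paper analyzes.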