Toward Machine Learning Through Genetic Code-like Transformations

  • Authors:
  • Hillol Kargupta and Samiran Ghosh

  • Affiliations:
  • Computer Science and Electrical Engineering Department, University of Maryland Baltimore County, Baltimore, MD 21250 (hillol@cs.umbc.edu, sghosh1@cs.umbc.edu)

  • Venue:
  • Genetic Programming and Evolvable Machines
  • Year:
  • 2002

Abstract

The gene expression process in nature involves several representation transformations of the genome. Translation is one of them: it constructs the amino acid sequence of a protein from the nucleic-acid-based mRNA sequence and is defined by a code book known as the universal genetic code. This paper explores the role of the genetic code and similar representation transformations in enhancing the performance of inductive machine learning algorithms. It considers an abstract model of genetic code-like transformations (GCTs) introduced elsewhere [21] and develops the notion of randomized GCTs. It shows that randomized GCTs can construct a representation of the learning problem in which the mean-square-error surface is almost convex quadratic and therefore easier to minimize, and it uses the functionally complete Fourier representation of Boolean functions to analyze this effect of such representation transformations. Experimental results substantiate the claim: a linear classifier such as the Perceptron [38] can learn non-linear XOR and DNF functions with a gradient-descent algorithm in a representation constructed by randomized GCTs. The paper also discusses the immediate challenges that must be addressed before the proposed technique becomes a viable approach to representation construction in machine learning.
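The core claim, that a code-like re-representation can make a non-linearly-separable problem such as XOR learnable by a linear unit trained with gradient descent on squared error, can be illustrated with a toy sketch. The construction below (a fixed random codeword per 2-bit input pattern, 8-dimensional Gaussian codewords, the learning rate, and the iteration count) is a hypothetical stand-in for illustration only, not the paper's actual GCT model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "code book": map each 2-bit input pattern to a fixed random
# 8-dimensional codeword. This is a hypothetical stand-in for a
# genetic code-like transformation, chosen so the mapping depends on
# the whole input block rather than on each bit separately.
CODE = {p: rng.normal(size=8) for p in [(0, 0), (0, 1), (1, 0), (1, 1)]}

def transform(x):
    """Re-represent a Boolean input via the random code book."""
    return CODE[tuple(x)]

# XOR training data, expressed in the transformed representation.
patterns = [(0, 0), (0, 1), (1, 0), (1, 1)]
X = np.array([transform(p) for p in patterns])
y = np.array([0, 1, 1, 0])

# A linear unit trained by gradient descent on the mean-square error.
# In the transformed space the error surface is convex quadratic, so
# plain gradient descent suffices.
w, b, lr = np.zeros(8), 0.0, 0.1
for _ in range(1000):
    err = y - (X @ w + b)
    w += lr * X.T @ err / len(y)
    b += lr * err.mean()

pred = (X @ w + b > 0.5).astype(int)
print(pred.tolist())  # → [0, 1, 1, 0]: XOR, learned by a linear unit
```

Four Gaussian codewords in eight dimensions are affinely independent with probability one, so every labeling of them, including XOR's, is linearly separable in the new representation; the perceptron itself never sees a non-linear problem.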