Evolving visual routines

  • Authors:
  • Michael Patrick Johnson; Pattie Maes; Trevor Darrell

  • Venue:
  • Artificial Life
  • Year:
  • 1994

Abstract

Traditional machine vision assumes that the vision system recovers a complete, labeled description of the world [10]. Recently, several researchers have criticized this model and proposed an alternative that considers perception as a distributed collection of task-specific, context-driven visual routines [1, 12]. Some of these researchers have argued that in natural living systems these visual routines are the product of natural selection [11]. So far, researchers have hand-coded task-specific visual routines for actual implementations (e.g., [3]). In this article we propose an alternative approach in which visual routines for simple tasks are created through artificial evolution. We present results from a series of runs on actual camera images, in which simple routines were evolved using genetic programming techniques [7]. The results are promising: the evolved routines correctly process up to 93% of the test images, better than any algorithm we were able to write by hand.
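
The abstract only summarizes the method, but the core loop it describes (maintain a population of candidate image-processing programs, score each on task performance, and breed the best) can be illustrated compactly. The Python sketch below is a minimal, hypothetical genetic-programming-style loop: the primitive operations, the linear routine representation, the toy "bright region" task, and all function names are assumptions made for illustration, not the authors' actual primitives, fitness measure, or data.

import random
import numpy as np

# Primitive image operations a routine may compose (hypothetical set).
def threshold(img): return (img > img.mean()).astype(float)
def smooth(img):    return (img + np.roll(img, 1, axis=0) + np.roll(img, 1, axis=1)) / 3.0
def invert(img):    return 1.0 - img

PRIMITIVES = [threshold, smooth, invert]
MAX_LEN = 4  # a routine is a short sequence of primitives

def random_routine():
    return [random.choice(PRIMITIVES) for _ in range(random.randint(1, MAX_LEN))]

def run_routine(routine, img):
    out = img
    for op in routine:
        out = op(out)
    return out

def fitness(routine, images, targets):
    # Fraction of images whose routine output agrees with the target mask (toy criterion).
    hits = 0
    for img, target in zip(images, targets):
        out = run_routine(routine, img)
        if np.mean((out > 0.5) == (target > 0.5)) > 0.9:
            hits += 1
    return hits / len(images)

def crossover(a, b):
    # Splice a prefix of one parent onto a suffix of the other.
    child = (a[:random.randint(0, len(a))] + b[random.randint(0, len(b)):])[:MAX_LEN]
    return child or [random.choice(PRIMITIVES)]

def mutate(routine):
    r = list(routine)
    r[random.randrange(len(r))] = random.choice(PRIMITIVES)
    return r

def evolve(images, targets, pop_size=40, generations=25):
    pop = [random_routine() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=lambda r: fitness(r, images, targets), reverse=True)
        parents = ranked[:pop_size // 2]  # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=lambda r: fitness(r, images, targets))

# Toy task: recover a "bright region" mask from random 16x16 images (illustrative only).
rng = np.random.default_rng(0)
images  = [rng.random((16, 16)) for _ in range(20)]
targets = [(img > img.mean()).astype(float) for img in images]

best = evolve(images, targets)
print([op.__name__ for op in best], fitness(best, images, targets))

Note that the published work applies genetic programming in the sense of [7], which is normally expressed over expression trees; the flat sequence of primitives above is used only to keep the sketch short, and the fitness test would in practice be run against labeled camera images rather than random arrays.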