Internal model generation for evolutionary acceleration of automated robotic assembly optimization

  • Authors:
  • Jeremy A. Marvel; Wyatt S. Newman

  • Affiliations:
  • Case Western Reserve University, Cleveland, OH (both authors)

  • Venue:
  • PerMIS '09: Proceedings of the 9th Workshop on Performance Metrics for Intelligent Systems
  • Year:
  • 2009

Abstract

While machine learning algorithms have been successfully applied to parameter optimization across a myriad of task configurations, without a virtual representation to permit offline training the learning process can be costly, both in time spent and in components worn or broken. Parameter spaces whose models are unknown, or too complex to simulate, stand to benefit from generated model approximations that reduce this evaluation overhead. In this paper, we describe a computational learning approach that dynamically generates internal models to accelerate Genetic Algorithm (GA) performance optimization. By exploring the parameter gene pool, a stochastic search method can effectively build a virtual model of the task space and improve the performance of the learning process. Experiments demonstrate that, in the presence of noise, neural network abstractions mapping sequence parameters to their resulting performances can effectively enhance stochastic parameter optimization. We present results illustrating the benefits of internal model building both in simulated experiments on complex problems and in physical robot assembly trials in which an industrial robotic arm assembles an aluminum puzzle.
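The core idea of the abstract — letting real trials feed an internal model that then pre-screens candidates so fewer of them need costly physical evaluation — can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the test function stands in for a physical assembly trial, and a k-nearest-neighbour regressor stands in for the paper's neural-network internal model. All names and parameter values here are hypothetical.

```python
import random

random.seed(0)  # reproducible demo


def expensive_fitness(x):
    """Stand-in for a physical assembly trial (hypothetical test
    function): lower is better, optimum at x_i = 0.5."""
    return sum((xi - 0.5) ** 2 for xi in x)


class SurrogateModel:
    """k-nearest-neighbour regressor, used here as a simple stand-in
    for the paper's neural-network internal model."""

    def __init__(self, k=3):
        self.k = k
        self.memory = []  # (parameters, measured fitness) from real trials

    def record(self, x, f):
        self.memory.append((x, f))

    def predict(self, x):
        # Average the measured fitness of the k closest tested points.
        dists = sorted(
            (sum((a - b) ** 2 for a, b in zip(m, x)), f)
            for m, f in self.memory
        )
        nearest = dists[: self.k]
        return sum(f for _, f in nearest) / len(nearest)


def mutate(x, sigma=0.1):
    # Gaussian perturbation, clipped to the unit hypercube.
    return [min(1.0, max(0.0, xi + random.gauss(0.0, sigma))) for xi in x]


def surrogate_assisted_ga(dim=4, pop_size=10, generations=20, screen_factor=3):
    surrogate = SurrogateModel()
    real_evals = 0

    def evaluate(x):
        # Every real trial also feeds the internal model.
        nonlocal real_evals
        f = expensive_fitness(x)
        surrogate.record(x, f)
        real_evals += 1
        return f

    pop = [[random.random() for _ in range(dim)] for _ in range(pop_size)]
    scored = [(evaluate(x), x) for x in pop]

    for _ in range(generations):
        scored.sort(key=lambda t: t[0])
        parents = [x for _, x in scored[: pop_size // 2]]
        # Over-produce offspring, then let the cheap surrogate pre-screen
        # them so only the most promising earn a costly real trial.
        candidates = [mutate(random.choice(parents))
                      for _ in range(pop_size * screen_factor)]
        candidates.sort(key=surrogate.predict)
        survivors = candidates[: pop_size // 2]
        scored = scored[: pop_size // 2] + [(evaluate(x), x) for x in survivors]

    scored.sort(key=lambda t: t[0])
    return scored[0], real_evals
```

With the defaults above, each generation produces 15 candidates but spends only 5 real evaluations, mirroring the overhead reduction the paper attributes to internal model building; calling `surrogate_assisted_ga()` returns the best `(fitness, parameters)` pair found and the total number of real trials consumed.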