Associating visual textures with human perceptions using genetic algorithms

  • Authors:
  • Werner Groissboeck, Edwin Lughofer, Stefan Thumfart

  • Affiliations:
  • Department of Knowledge-based Mathematical Systems, Johannes Kepler University of Linz, A-4040 Linz, Austria; Department of Knowledge-based Mathematical Systems, Johannes Kepler University of Linz, A-4040 Linz, Austria; Machine Vision Group, Profactor GmbH, Im Stadtgut A2, 4407 Steyr-Gleink, Austria

  • Venue:
  • Information Sciences: an International Journal
  • Year:
  • 2010

Abstract

This paper presents an approach for associating visual textures with given human perceptions. Based on a forward model that predicts human perceptions for given visual textures, we derive a reverse process that associates and characterizes visual textures for given human perceptions. To this end, we propose a constraint-based genetic algorithm that minimizes a specific optimization problem whose constraints take the form of bandwidths for valid individuals (low-level features extracted from textures) in a population. The constraints are determined by relationships between the (low-level) features characterizing textures, expressed as high-dimensional approximation models. Additionally, in each iteration a check for valid individuals is carried out with a texture/non-texture classifier or by using a convex hull over a set of valid textures. The whole approach is evaluated on a real-world texture set, used as the start population of the genetic algorithm, and on various kinds of human perceptions (for which textures are sought) represented by adjective vectors in the aesthetic space. The generated individuals (low-level feature vectors) achieve a high level of fitness (they lie quite close to the pre-defined adjective vectors) and a small distance to the initial population. The textures synthesized from the generated individuals are visualized and compared with textures synthesized by a time-intensive direct texture mixing and re-combination method based on a real-world texture database.
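
The sketch below illustrates, in rough outline only, the kind of constrained genetic algorithm the abstract describes: candidate low-level feature vectors are evolved so that a forward model maps them close to a target adjective vector, while bandwidth constraints and a validity check filter out individuals that do not correspond to plausible textures. All names here (forward_model, is_valid, the feature bounds, the fitness weighting) are hypothetical stand-ins, not the authors' actual components or parameters.

```python
# Hypothetical sketch of a constrained GA in the spirit of the abstract.
# forward_model, is_valid, and the bounds are placeholder assumptions,
# not the paper's learned forward model or texture/non-texture classifier.
import numpy as np

rng = np.random.default_rng(0)
N_FEATURES = 8                               # dimensionality of the low-level feature vectors
TARGET = np.full(N_FEATURES // 2, 0.7)       # target adjective vector in the aesthetic space

def forward_model(features):
    """Placeholder forward model: maps low-level features to perceived adjectives."""
    return np.tanh(features[: N_FEATURES // 2] + 0.1 * features[N_FEATURES // 2:])

def is_valid(features, lower, upper):
    """Bandwidth constraints: every feature must stay inside its valid band."""
    return np.all((features >= lower) & (features <= upper))

def fitness(features, start_population):
    """Closeness of the predicted perception to the target adjective vector,
    plus a small penalty for drifting far from the initial (real-world) population."""
    perception_error = np.linalg.norm(forward_model(features) - TARGET)
    drift = np.min(np.linalg.norm(start_population - features, axis=1))
    return perception_error + 0.1 * drift

def run_ga(start_population, lower, upper, generations=200, mutation_scale=0.05):
    population = start_population.copy()
    for _ in range(generations):
        # Random parent pairing, arithmetic crossover, Gaussian mutation.
        idx = rng.integers(0, len(population), size=(len(population), 2))
        alpha = rng.random((len(population), 1))
        children = alpha * population[idx[:, 0]] + (1 - alpha) * population[idx[:, 1]]
        children += rng.normal(scale=mutation_scale, size=children.shape)
        # Keep only children that satisfy the bandwidth / validity constraints.
        valid = np.array([is_valid(c, lower, upper) for c in children])
        candidates = np.vstack([population, children[valid]])
        # Elitist selection: retain the lowest-fitness (best) individuals.
        scores = np.array([fitness(c, start_population) for c in candidates])
        population = candidates[np.argsort(scores)[: len(start_population)]]
    return population[0]

if __name__ == "__main__":
    start = rng.random((30, N_FEATURES))     # features of real-world textures (start population)
    lower, upper = np.zeros(N_FEATURES), np.ones(N_FEATURES)
    best = run_ga(start, lower, upper)
    print("best feature vector:", np.round(best, 3))
```

In the paper's setting, the resulting feature vector would then be passed to a texture synthesis step to visualize a texture matching the requested perception; that step is not modeled in this sketch.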