Matching and retrieval based on the vocabulary and grammar of color patterns

  • Authors:
  • A. Mojsilovic, J. Kovacevic, Jianying Hu, R. J. Safranek, S. K. Ganapathy

  • Affiliations:
  • Lucent Technologies, Bell Labs, Murray Hill, NJ

  • Venue:
  • IEEE Transactions on Image Processing
  • Year:
  • 2000

Abstract

We propose a perceptually based system for pattern retrieval and matching. The central idea is that similarity judgments have to be modeled along perceptual dimensions. Hence, we detect the basic visual categories that people use in their judgments of similarity, and design a computational model that accepts patterns as input and, depending on the query, produces a set of choices that follow human behavior in pattern matching. There are two major research aspects to our work. The first addresses the issue of how humans perceive and measure similarity within the domain of color patterns. To understand and describe this mechanism, we performed a subjective experiment which yielded five perceptual criteria used in comparisons between color patterns (the vocabulary), as well as a set of rules governing the use of these criteria in similarity judgment (the grammar). The second research aspect is the implementation of the perceptual criteria and rules in an image retrieval system. Following the processing typical of human vision, we design the system to: (1) extract perceptual features from the vocabulary and (2) perform the comparison between patterns according to the grammar rules. The modeling of human perception of color patterns is new, starting with a new color codebook design, a compact color representation, and a texture description based on multi-scale edge distributions along different directions. Moreover, we propose new color and texture distance functions that correlate with human performance. The performance of the system is illustrated with numerous examples from image databases drawn from different application domains.
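
As a rough illustration of the pipeline the abstract describes (codebook-based color features, multi-scale directional edge features, and a combined distance used for ranking), the sketch below shows one minimal way such a retrieval step could be wired together. It is an assumption-laden stand-in, not the authors' method: the fixed codebook, the block-averaging pyramid, the gradient-based orientation histogram, and the equal weights are all hypothetical simplifications of the paper's perceptual codebook, multi-scale edge distribution, and grammar-derived distance rules.

```python
import numpy as np

def quantize_colors(image, codebook):
    """Map each RGB pixel to its nearest codebook color.
    (Illustrative: the paper designs a perceptual codebook, taken as given here.)"""
    pixels = image.reshape(-1, 3).astype(float)
    # Squared Euclidean distance from every pixel to every codebook entry
    d = ((pixels[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

def color_feature(image, codebook):
    """Compact color representation: a normalized histogram over the codebook."""
    labels = quantize_colors(image, codebook)
    hist = np.bincount(labels, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

def texture_feature(gray, scales=(1, 2, 4), n_orientations=4):
    """Edge energy by scale and direction; a crude stand-in for the paper's
    multi-scale directional edge distribution."""
    feats = []
    for s in scales:
        # Hypothetical scale change by block averaging
        h, w = gray.shape[0] // s * s, gray.shape[1] // s * s
        small = gray[:h, :w].reshape(h // s, s, w // s, s).mean(axis=(1, 3))
        gy, gx = np.gradient(small.astype(float))
        mag = np.hypot(gx, gy)
        ang = np.arctan2(gy, gx) % np.pi  # orientation folded into [0, pi)
        bins = np.minimum((ang / np.pi * n_orientations).astype(int),
                          n_orientations - 1)
        # Edge-magnitude-weighted orientation histogram at this scale
        hist = np.bincount(bins.ravel(), weights=mag.ravel(),
                           minlength=n_orientations)
        feats.append(hist / (hist.sum() + 1e-12))
    return np.concatenate(feats)

def pattern_distance(q, t, w_color=0.5, w_texture=0.5):
    """Weighted combination of color and texture distances. The fixed weights
    mimic the 'grammar' only in spirit; the paper's rules are more elaborate."""
    d_color = np.abs(q["color"] - t["color"]).sum()      # L1 on histograms
    d_texture = np.abs(q["texture"] - t["texture"]).sum()
    return w_color * d_color + w_texture * d_texture
```

In use, each database image would be reduced once to a record such as `{"color": color_feature(img, codebook), "texture": texture_feature(gray)}`, and a query would be answered by sorting the records by `pattern_distance` to the query's record; the paper replaces each of these placeholder pieces with perceptually derived counterparts.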