Finding textures by textual descriptions, visual examples, and relevance feedbacks

  • Authors:
  • Hsin-Chih Lin;Chih-Yi Chiu;Shi-Nine Yang

  • Affiliations:
  • Department of Information Management, Chang Jung Christian University, 396 Chang Jung RD., Sec. 1, Tainan County 711, Taiwan;Department of Computer Science, National Tsing Hua University, Hsinchu 300, Taiwan;Department of Computer Science, National Tsing Hua University, Hsinchu 300, Taiwan

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2003

Abstract

In this study, we propose a fuzzy-logic CBIR (content-based image retrieval) system for finding textures. A user can submit textual descriptions and/or visual examples to find the desired textures. After the initial search, the user can give relevant and/or irrelevant examples to refine the query and improve retrieval efficiency. The contributions of this study are fourfold. (1) The system maps low-level statistical features to high-level textual concepts, bridging the semantic gap between the two levels. (2) It characterizes texture properties at both levels and thereby enables high-level texture manipulation through textual concepts. (3) It models the subjectivity of human perception via relevance feedback to perform more accurate retrieval. (4) It provides intuitive and simple definitions and computations of similarity. Experimental results show that the system is effective: the retrieved images are perceptually satisfactory, and retrieval times are short.
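
The abstract does not spell out how low-level features are mapped to textual concepts or how similarity is computed. As a rough, hypothetical illustration of the general idea only (not the authors' actual formulation), the Python sketch below assumes a single normalized "coarseness" statistic, triangular fuzzy membership functions for the concepts "fine", "medium", and "coarse", and a simple overlap-based similarity; all of these names and parameters are assumptions made for illustration.

```python
# Illustrative sketch only: NOT the paper's algorithm. It assumes a hypothetical
# normalized "coarseness" statistic and triangular membership functions to show
# how a low-level texture feature could be fuzzified into textual concepts and
# how two fuzzy descriptors could be compared.

from typing import Dict


def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular fuzzy membership: 0 at a, peak 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)


def fuzzify_coarseness(coarseness: float) -> Dict[str, float]:
    """Map a coarseness value in [0, 1] to degrees of textual concepts."""
    return {
        "fine":   triangular(coarseness, -0.01, 0.0, 0.5),
        "medium": triangular(coarseness, 0.0, 0.5, 1.0),
        "coarse": triangular(coarseness, 0.5, 1.0, 1.01),
    }


def fuzzy_similarity(q: Dict[str, float], t: Dict[str, float]) -> float:
    """Normalized overlap of membership degrees (Jaccard-style min/max ratio)."""
    overlap = sum(min(q[k], t[k]) for k in q)
    total = sum(max(q[k], t[k]) for k in q)
    return overlap / total if total > 0 else 0.0


if __name__ == "__main__":
    query = fuzzify_coarseness(0.2)    # e.g., a mostly "fine" query texture
    target = fuzzify_coarseness(0.35)  # a candidate database texture
    print(query)
    print(target)
    print("similarity:", round(fuzzy_similarity(query, target), 3))
```

Under this kind of scheme, a textual query such as "fine" can be treated as a fuzzy descriptor directly, and relevance feedback could, for example, shift the concept memberships of the query toward those of images marked relevant; again, this is only a sketch of the general approach, not the specific method reported in the paper.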