Automatic textile image annotation by predicting emotional concepts from visual features

  • Authors:
  • Yunhee Shin; Youngrae Kim; Eun Yi Kim

  • Affiliations:
  • Visual Information Processing Lab., Dept. of Advanced Technology Fusion, Konkuk Univ., Republic of Korea (all authors)

  • Venue:
  • Image and Vision Computing
  • Year:
  • 2010

Abstract

This paper presents an emotion prediction system that automatically predicts human emotional concepts from a given textile image. The main application motivating this study is textile image annotation, which has recently expanded rapidly in relation to the Web. In the proposed method, color and pattern are used as cues to predict the emotional semantics associated with an image; these features are extracted using color quantization and a multi-level wavelet transform, respectively. The extracted features are then fed to three representative classifiers widely used in data mining: K-means clustering, a naive Bayes classifier, and a multi-layer perceptron (MLP). When the proposed emotion prediction method is evaluated on 3600 textile images, the MLP yields the best performance. The proposed MLP-based method is then compared with methods that use only color or only pattern, and it again achieves the best performance, with an accuracy above 92%. These results confirm that the proposed method can be effectively applied to the commercial textile industry and to image retrieval.
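The abstract does not give implementation details, but the described feature extraction can be illustrated with a minimal NumPy sketch: a joint color histogram over quantized RGB channels (standing in for the paper's color quantization), and subband energies from a multi-level Haar wavelet decomposition as a simple pattern descriptor. The bin count, number of levels, and Haar basis are assumptions for illustration, not the authors' actual configuration.

```python
import numpy as np

def color_features(rgb, bins=4):
    # Quantize each RGB channel into `bins` levels and build a joint
    # color histogram with bins**3 entries, normalized to sum to 1.
    q = np.clip((rgb.astype(np.float64) / 256.0 * bins).astype(int), 0, bins - 1)
    idx = (q[..., 0] * bins + q[..., 1]) * bins + q[..., 2]
    hist = np.bincount(idx.ravel(), minlength=bins ** 3).astype(np.float64)
    return hist / hist.sum()

def haar_level(x):
    # One level of a 2-D Haar wavelet transform: returns the approximation
    # (LL) subband and the three detail subbands (LH, HL, HH).
    a = (x[0::2, :] + x[1::2, :]) / 2.0   # row-wise average
    d = (x[0::2, :] - x[1::2, :]) / 2.0   # row-wise difference
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, (lh, hl, hh)

def pattern_features(gray, levels=3):
    # Mean energy of each detail subband across `levels` decomposition
    # levels serves as a simple multi-scale texture/pattern descriptor.
    feats = []
    x = gray.astype(np.float64)
    for _ in range(levels):
        x, details = haar_level(x)
        feats.extend(float(np.mean(d ** 2)) for d in details)
    return np.array(feats)

def textile_features(rgb):
    # Concatenate color and pattern cues into one feature vector,
    # which would then be fed to a classifier such as an MLP.
    gray = rgb.astype(np.float64).mean(axis=2)
    return np.concatenate([color_features(rgb), pattern_features(gray)])

# Example on a random 64x64 "textile" image: 4**3 = 64 color bins
# plus 3 levels x 3 subbands = 9 pattern energies -> 73 features.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
print(textile_features(img).shape)
```

In the paper's pipeline, vectors like these would be the input to the K-means, naive Bayes, and MLP classifiers being compared.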