Emotion related structures in large image databases

  • Authors:
  • Martin Solli; Reiner Lenz

  • Affiliations:
  • Linköping University, Norrköping, Sweden; Linköping University, Norrköping, Sweden

  • Venue:
  • Proceedings of the ACM International Conference on Image and Video Retrieval
  • Year:
  • 2010

Abstract

We introduce two large databases of 750,000 and 1.2 million thumbnail-sized images labeled with emotion-related keywords. The smaller database consists of images from the image provider Matton Images. The larger database consists of web images indexed by the crawler of the image search engine Picsearch. The images in the Picsearch database belong to one of 98 emotion-related categories and carry metadata in the form of secondary keywords, the originating website, and some view statistics. We characterize the visual properties of the images with two psychophysics-related feature vectors based on the emotional impact of color combinations, the standard RGB histogram, and two SIFT-related descriptors. These features are then used in two-class classification experiments to explore the discrimination properties of emotion-related categories. The clustering software and the classifiers are available in the public domain, and the same standard configurations are used in all experiments. Our findings show that for emotional categories, descriptors based on global image statistics (global histograms) perform better than local image descriptors (bag-of-words models). This indicates that content-based indexing and retrieval using emotion-based approaches differ fundamentally from the dominant object-recognition approaches, for which SIFT-related features are the standard descriptors.
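The abstract contrasts global image statistics (an RGB histogram) with local descriptors aggregated into a bag-of-visual-words model, evaluated in a two-class setup. The sketch below is not the authors' code; it is a minimal illustration of that comparison using OpenCV and scikit-learn, with a linear SVM standing in for the paper's classifier. The file paths and the "calm" / "dramatic" category names are hypothetical placeholders.

```python
# Minimal sketch: global RGB histogram vs. SIFT bag-of-visual-words features
# for a two-class classification experiment. Requires opencv-python (>= 4.4,
# which bundles SIFT) and scikit-learn. Paths and labels below are placeholders.

import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score


def rgb_histogram(image, bins=8):
    """Global 8x8x8 RGB histogram, L1-normalized (a global image statistic)."""
    hist = cv2.calcHist([image], [0, 1, 2], None, [bins] * 3,
                        [0, 256, 0, 256, 0, 256]).flatten()
    return hist / (hist.sum() + 1e-9)


def bow_histograms(images, vocab_size=200):
    """Bag-of-visual-words histograms over SIFT descriptors (local features)."""
    sift = cv2.SIFT_create()
    per_image_desc = []
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        _, desc = sift.detectAndCompute(gray, None)
        per_image_desc.append(desc if desc is not None
                              else np.zeros((0, 128), np.float32))
    stacked = np.vstack(per_image_desc)
    k = min(vocab_size, stacked.shape[0])          # guard for tiny toy inputs
    codebook = KMeans(n_clusters=k, n_init=4).fit(stacked)
    feats = []
    for desc in per_image_desc:
        words = codebook.predict(desc) if len(desc) else np.array([], int)
        hist = np.bincount(words, minlength=vocab_size).astype(float)
        feats.append(hist / (hist.sum() + 1e-9))
    return np.array(feats)


# Hypothetical file lists for two emotion-related categories.
calm_paths = ["calm_0001.jpg", "calm_0002.jpg"]
dramatic_paths = ["dramatic_0001.jpg", "dramatic_0002.jpg"]

images = [cv2.imread(p) for p in calm_paths + dramatic_paths]
labels = np.array([0] * len(calm_paths) + [1] * len(dramatic_paths))

X_global = np.array([rgb_histogram(img) for img in images])
X_local = bow_histograms(images)

# Compare the two feature types with the same standard classifier configuration.
for name, X in [("global RGB histogram", X_global), ("SIFT bag-of-words", X_local)]:
    scores = cross_val_score(LinearSVC(max_iter=10000), X, labels, cv=2)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```

With realistic category sizes (hundreds of images per class, as in the databases described above), the same loop reproduces the kind of head-to-head comparison the abstract reports, where the global histogram features would be expected to score higher on emotion-related categories.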