Computing similarity between cultural heritage items using multimodal features

  • Authors:
  • Nikolaos Aletras; Mark Stevenson

  • Affiliations:
  • University of Sheffield, Sheffield, UK; University of Sheffield, Sheffield, UK

  • Venue:
  • LaTeCH '12 Proceedings of the 6th Workshop on Language Technology for Cultural Heritage, Social Sciences, and Humanities
  • Year:
  • 2012


Abstract

A significant amount of information about Cultural Heritage artefacts is now available in digital libraries. The ability to identify similar items would be useful for search and navigation through these data sets. Information about items in these repositories is often multimodal, such as a picture of the artefact accompanied by a textual description. This paper explores the use of information from these various media for computing similarity between Cultural Heritage artefacts. Results show that combining information from images and text produces better estimates of similarity than using either medium alone.
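The combination strategy described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes each artefact is represented by a text feature vector and an image feature vector, computes cosine similarity separately in each medium, and merges the two scores with a hypothetical interpolation weight `alpha`:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    if norm_u == 0.0 or norm_v == 0.0:
        return 0.0
    return dot / (norm_u * norm_v)

def multimodal_similarity(item_a, item_b, alpha=0.5):
    """Linear combination of text-based and image-based similarity.

    `alpha` weights the text score against the image score; the
    field names `text_vec` and `image_vec` are illustrative, not
    taken from the paper.
    """
    text_sim = cosine(item_a["text_vec"], item_b["text_vec"])
    image_sim = cosine(item_a["image_vec"], item_b["image_vec"])
    return alpha * text_sim + (1.0 - alpha) * image_sim

# Two toy artefact records with text and image feature vectors.
vase = {"text_vec": [1.0, 0.0, 1.0], "image_vec": [0.5, 0.5]}
urn = {"text_vec": [1.0, 1.0, 0.0], "image_vec": [0.5, 0.5]}

score = multimodal_similarity(vase, urn, alpha=0.5)
```

With `alpha=0.5` the combined score averages the two per-medium similarities, so items that look alike but are described differently (or vice versa) still receive partial credit.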