Social event detection using multimodal clustering and integrating supervisory signals

  • Authors:
  • Georgios Petkos; Symeon Papadopoulos; Yiannis Kompatsiaris

  • Affiliations:
  • Informatics and Telematics Institute, Thessaloniki, Greece (all authors)

  • Venue:
  • Proceedings of the 2nd ACM International Conference on Multimedia Retrieval
  • Year:
  • 2012

Abstract

A large variety of features can be extracted from raw multimedia items. Moreover, in many contexts, such as multimedia uploaded by users of social media platforms, items may be linked to metadata that is very useful for a variety of analysis tasks. Nevertheless, such features are typically heterogeneous and difficult to combine into a unified representation suitable for analysis. In this paper, we discuss the problem of clustering collections of multimedia items with the purpose of detecting social events. To this end, a novel multimodal clustering algorithm is proposed. The proposed method uses a known clustering in the currently examined domain to supervise the multimodal fusion and clustering procedure. It is tested on the MediaEval social event detection challenge data and compared to a multimodal spectral clustering approach that uses early fusion. By taking advantage of the explicit supervisory signal, it achieves superior clustering accuracy and requires the specification of far fewer parameters. Moreover, the proposed approach has wider scope: it is applicable not only to social event detection but to other multimodal clustering problems as well.
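The abstract's central idea, using a known clustering as a supervisory signal to learn how to fuse heterogeneous modalities, can be illustrated with a minimal sketch: train a pairwise "same event?" classifier on per-modality distances derived from the known clustering, then link test-item pairs the classifier accepts. All function names, the synthetic two-modality data, and the union-find linking step below are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def pair_features(items, i, j):
    # Per-modality absolute distances for an item pair
    # (here two toy modalities, e.g. capture time and longitude).
    return np.abs(items[i] - items[j])

def learn_same_event_model(items, labels):
    # Supervisory signal: a known clustering ("labels") teaches the
    # model how modality distances combine into same-event evidence.
    X, y = [], []
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            X.append(pair_features(items, i, j))
            y.append(int(labels[i] == labels[j]))
    clf = LogisticRegression()
    clf.fit(np.array(X), np.array(y))
    return clf

def cluster(items, clf, threshold=0.5):
    # Link pairs the model deems "same event", then take
    # connected components via a simple union-find.
    n = len(items)
    parent = list(range(n))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    for i in range(n):
        for j in range(i + 1, n):
            feats = pair_features(items, i, j).reshape(1, -1)
            if clf.predict_proba(feats)[0, 1] >= threshold:
                parent[find(i)] = find(j)
    return [find(i) for i in range(n)]

# Illustrative synthetic data: three known events supervise training,
# two unseen events are then clustered.
rng = np.random.default_rng(0)
train = np.vstack([rng.normal(c, 0.1, (5, 2)) for c in [(0, 0), (5, 5), (10, 0)]])
train_labels = [0] * 5 + [1] * 5 + [2] * 5
clf = learn_same_event_model(train, train_labels)

test_items = np.vstack([rng.normal(c, 0.1, (4, 2)) for c in [(1, 1), (6, 2)]])
assignments = cluster(test_items, clf)
```

Because the supervision is expressed at the level of item pairs, the learned fusion transfers to new collections without re-tuning per-modality weights, which is one way to read the abstract's claim that far fewer parameters need to be specified.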