Enriching user profiling with affective features for the improvement of a multimodal recommender system

  • Authors:
  • Ioannis Arapakis; Yashar Moshfeghi; Hideo Joho; Reede Ren; David Hannah; Joemon M. Jose

  • Affiliation:
  • University of Glasgow, Lilybank Gardens, Glasgow (all authors)

  • Venue:
  • Proceedings of the ACM International Conference on Image and Video Retrieval
  • Year:
  • 2009

Abstract

Recommender systems have been systematically applied in industry and academia to help users cope with information uncertainty. However, given the multiplicity of user preferences and needs, it has been shown that no single approach is suitable for all users in all situations. It is therefore believed that an effective recommender system should incorporate a variety of techniques and features to offer valuable recommendations and enhance the search experience. In this paper we propose a novel video search interface that employs a multimodal recommender system capable of predicting topical relevance. The multimodal recommender accounts for interaction data, contextual information, and users' affective responses, and exploits these information channels to provide meaningful recommendations of unseen videos. Our experiment shows that multimodal interaction features are a promising way to improve recommendation performance.
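
To make the fusion idea concrete, the sketch below shows one simple way such a multimodal recommender could combine interaction, contextual, and affective evidence into a single topical-relevance score. This is not the authors' implementation; the feature names, weights, and logistic scoring function are illustrative assumptions only.

```python
# Minimal sketch (assumed, not the paper's method): early fusion of
# per-modality feature vectors followed by a logistic relevance score.
import math
from typing import List


def fuse_features(interaction: List[float],
                  context: List[float],
                  affective: List[float]) -> List[float]:
    """Early fusion: concatenate the per-modality feature vectors."""
    return interaction + context + affective


def relevance_score(features: List[float],
                    weights: List[float],
                    bias: float) -> float:
    """Logistic score in [0, 1], read as predicted topical relevance."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))


# Hypothetical example: dwell time and click count (interaction),
# a task-context indicator, and a facial-expression valence estimate.
profile = fuse_features(interaction=[0.8, 0.3],
                        context=[1.0],
                        affective=[0.6])
weights = [1.2, 0.4, 0.7, 1.5]  # illustrative values only
print(relevance_score(profile, weights, bias=-1.0))
```

In practice the weights would be learned from relevance judgements rather than set by hand; the point here is only that the three information channels can be scored jointly rather than in isolation.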