Do clicks measure recommendation relevancy?: an empirical user study

  • Authors:
  • Hua Zheng; Dong Wang; Qi Zhang; Hang Li; Tinghao Yang

  • Affiliations:
  • Hulu, Beijing, China (all authors)

  • Venue:
  • Proceedings of the Fourth ACM Conference on Recommender Systems (RecSys '10)
  • Year:
  • 2010

Abstract

Evaluation has been an important subject since the early days of recommender systems. In online tests, the click-through rate (CTR) is often adopted as the metric. However, a higher CTR on recommended items does not imply higher relevance between items, since factors such as item popularity or serendipity may influence users' click behavior. We argue that the relevance of recommendations is also desirable in many real applications; here, relevance means relevance in a humanly perceptible way. Relevant recommendations not only increase users' trust in the system but are also extremely useful for the large number of anonymous users, whose recommendations may be made based only on the current item. In this paper, we empirically examine the relation between the relevance of recommendations and the corresponding CTR for several representative ItemCF algorithms, using online data from the TV show/movie website Hulu. Experiments show that algorithms with higher overall CTR do not necessarily produce more relevant recommendations. Thus, CTR may not be the optimal metric for online evaluation of recommender systems when producing relevant recommendations is important.
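The abstract contrasts CTR with perceived relevance as evaluation criteria. As a point of reference only, the sketch below shows how per-algorithm CTR is typically computed from impression and click logs in an online test; the log format and algorithm names are illustrative assumptions, not the authors' actual pipeline or data.

```python
# Minimal sketch of per-algorithm CTR from an online A/B-style test log.
# The log format and algorithm names here are hypothetical.
from collections import defaultdict

# Each record: (algorithm that produced the recommendation, item shown, clicked?)
impressions = [
    ("itemcf_cosine", "show_1", True),
    ("itemcf_cosine", "show_2", False),
    ("itemcf_popularity", "show_3", True),
    ("itemcf_popularity", "show_1", True),
    ("itemcf_popularity", "show_4", False),
]

shown = defaultdict(int)    # impressions per algorithm
clicked = defaultdict(int)  # clicks per algorithm

for algo, _item, was_clicked in impressions:
    shown[algo] += 1
    clicked[algo] += int(was_clicked)

# CTR = clicks / impressions, compared across algorithms.
for algo in sorted(shown):
    print(f"{algo}: CTR = {clicked[algo] / shown[algo]:.2f}")
```

The paper's point is that ranking algorithms by this quantity alone can disagree with a human judgment of how relevant the recommended items are to the source item.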