Generalizing matrix factorization through flexible regression priors

  • Authors: Liang Zhang, Deepak Agarwal, Bee-Chung Chen
  • Affiliations: Yahoo! Labs, Santa Clara, CA, USA (all authors)

  • Venue: Proceedings of the fifth ACM conference on Recommender systems
  • Year: 2011


Abstract

Predicting user "ratings" on items is a crucial task in recommender systems. Matrix factorization methods that compute a low-rank approximation of the incomplete user-item rating matrix provide state-of-the-art performance, especially for users and items with several past ratings (warm starts). However, it is challenging to generalize such methods to users and items with few or no past ratings (cold starts). Prior work [4][32] generalized matrix factorization to include both user and item features, both to better regularize the factors and to provide a model for a smooth transition from cold starts to warm starts. However, the features were incorporated via linear regression on the factor estimates. In this paper, we generalize this approach to allow for arbitrary regression models such as decision trees, boosting, and LASSO. The key advantage of our approach is its ease of computation: any new regression procedure can be incorporated by "plugging" a standard regression routine into a few intermediate steps of our model-fitting procedure. With this flexibility, one can leverage a large body of work on regression modeling, variable selection, and model interpretation. We demonstrate the usefulness of this generalization using the MovieLens and Yahoo! Buzz datasets.
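To make the "plug-in regression" idea concrete, the following is a minimal sketch (not the authors' implementation; all function and variable names are hypothetical) of matrix factorization with regression priors. User and item factors are updated by ridge-style least squares, but each factor is shrunk toward a prior mean g(x) predicted from its features by a regression step; here that step is ordinary least squares via NumPy, and swapping in a tree ensemble or LASSO would only change that one step.

```python
import numpy as np

def fit_factors_with_regression_prior(ratings, user_feats, item_feats,
                                      k=2, lam=1.0, iters=10, seed=0):
    """Hypothetical sketch: factorize ratings with feature-based priors.

    ratings    : list of (user_index, item_index, rating) triples
    user_feats : (n_users, p) feature matrix
    item_feats : (n_items, q) feature matrix
    Factors are shrunk toward regression predictions on the features;
    the regression here is plain least squares, but any regressor
    (trees, boosting, LASSO, ...) could be plugged into that step.
    """
    rng = np.random.default_rng(seed)
    n_u, n_i = user_feats.shape[0], item_feats.shape[0]
    U = rng.normal(scale=0.1, size=(n_u, k))
    V = rng.normal(scale=0.1, size=(n_i, k))
    by_user = {i: [(j, r) for (i2, j, r) in ratings if i2 == i] for i in range(n_u)}
    by_item = {j: [(i, r) for (i, j2, r) in ratings if j2 == j] for j in range(n_i)}
    for _ in range(iters):
        # Plug-in regression step: fit prior means g(x) for each factor dim.
        Gu = np.linalg.lstsq(user_feats, U, rcond=None)[0]
        Gv = np.linalg.lstsq(item_feats, V, rcond=None)[0]
        mu_U, mu_V = user_feats @ Gu, item_feats @ Gv
        # Ridge-style factor updates, shrunk toward the regression prior mean.
        for i in range(n_u):
            A, b = lam * np.eye(k), lam * mu_U[i]
            for j, r in by_user[i]:
                A += np.outer(V[j], V[j])
                b += r * V[j]
            U[i] = np.linalg.solve(A, b)
        for j in range(n_i):
            A, b = lam * np.eye(k), lam * mu_V[j]
            for i, r in by_item[j]:
                A += np.outer(U[i], U[i])
                b += r * U[i]
            V[j] = np.linalg.solve(A, b)
    return U, V, mu_U, mu_V
```

For a cold-start user with no ratings, the update reduces to the prior mean, so the prediction falls back to the feature regression `mu_U[i] @ V[j]`; as ratings accumulate, the observed terms dominate and the model transitions smoothly to the warm-start factorization.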