Collaborative filtering (CF), one of the most successful recommendation techniques, must cope with challenges such as high data sparsity and demanding scalability requirements. Bayesian belief nets (BNs), among the most frequently used classifiers, can also be applied to CF tasks. Previous work applying BNs to CF focused mainly on binary-class data and used simple or basic Bayesian classifiers [1][2]. In this work, we apply advanced BN models to CF tasks instead of simple ones, and evaluate them on real-world multi-class CF data rather than synthetic binary-class data. Empirical results show that, owing to their ability to handle incomplete data, extended logistic regression on naïve Bayes and on tree-augmented naïve Bayes (NB-ELR and TAN-ELR) models [3] consistently outperform the state-of-the-art Pearson correlation-based CF algorithm. In addition, the ELR-optimized BN CF models remain robust in their ability to make predictions, whereas the robustness of the Pearson correlation-based CF algorithm degrades as data sparsity increases.
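To make the classification view of CF concrete: the idea is to treat the rating a user would give a target item as a class label and the user's other ratings as features. The sketch below uses a plain naïve Bayes classifier with Laplace smoothing, not the ELR-optimized NB-ELR/TAN-ELR models from [3]; the function name, dict-based rating representation, and toy data are all illustrative assumptions.

```python
import math

def nb_predict(ratings, active, target, classes=(1, 2, 3, 4, 5), alpha=1.0):
    """Predict the active user's rating of `target` as a classification task.

    ratings : list of dicts mapping item -> rating (one dict per training user)
    active  : dict of the active user's known ratings (features)
    Returns the class (rating value) with the highest posterior score.
    """
    # Training users are those who actually rated the target item.
    train = [r for r in ratings if target in r]
    scores = {}
    for c in classes:
        rows = [r for r in train if r[target] == c]
        # Laplace-smoothed class prior P(rating(target) = c)
        logp = math.log((len(rows) + alpha) / (len(train) + alpha * len(classes)))
        # Naive-Bayes likelihood: each observed rating is an independent feature.
        for item, val in active.items():
            match = sum(1 for r in rows if r.get(item) == val)
            logp += math.log((match + alpha) / (len(rows) + alpha * len(classes)))
        scores[c] = logp
    return max(scores, key=scores.get)
```

Missing ratings simply contribute no feature term, which is one way a Bayesian classifier tolerates incomplete CF data.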
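The baseline referenced above is a Pearson correlation-based CF algorithm. A minimal user-based sketch under common assumptions (a dense rating matrix where 0 means "unrated", positive-similarity neighbors only, mean-centered aggregation); `pearson_sim` and `predict` are illustrative names, not the authors' implementation:

```python
import numpy as np

def pearson_sim(u, v):
    """Pearson correlation over items co-rated by two users (0 = unrated)."""
    mask = (u > 0) & (v > 0)
    if mask.sum() < 2:
        return 0.0
    uc = u[mask] - u[mask].mean()
    vc = v[mask] - v[mask].mean()
    denom = np.sqrt((uc ** 2).sum() * (vc ** 2).sum())
    return float(uc @ vc / denom) if denom > 0 else 0.0

def predict(R, user, item):
    """Predict R[user, item] from mean-centered ratings of similar users."""
    base = R[user][R[user] > 0].mean()  # active user's mean rating
    num = den = 0.0
    for other in range(R.shape[0]):
        if other == user or R[other, item] == 0:
            continue
        s = pearson_sim(R[user], R[other])
        if s <= 0:
            continue  # keep positively correlated neighbors only
        num += s * (R[other, item] - R[other][R[other] > 0].mean())
        den += s
    return base + num / den if den > 0 else base
```

As sparsity grows, co-rated overlaps shrink, correlations become unreliable, and predictions fall back to the user mean, which illustrates the robustness degradation the abstract reports.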