In this paper, we present a novel methodology for preference learning based on the concept of inductive transfer. Specifically, we introduce a nonparametric hierarchical Bayesian multitask learning approach, built on the notion that human subjects may cluster into groups of individuals with similar preference rationales (but not identical preferences). Our approach employs a Dirichlet process prior, which allows the most appropriate number of subject groups (clusters) to be inferred automatically, together with the automatic relevance determination (ARD) mechanism, which yields a sparse model and thus significantly enhances computational efficiency. We evaluate the efficacy of our approach on both a synthetic experiment and a real-world music recommendation application. As we show, our approach significantly enhances the effectiveness of knowledge transfer in statistical preference learning: it correctly infers the actual number of human subject groups in a modeled dataset and limits knowledge transfer to subjects belonging to the same group (where knowledge is more likely to be transferable).
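To give intuition for how a Dirichlet process prior can infer the number of subject groups automatically, the sketch below samples a partition of subjects from its predictive distribution, the Chinese restaurant process. This is only an illustrative sketch of the prior's clustering behavior, not the authors' inference procedure; the concentration parameter `alpha` and the sampling routine are assumptions for illustration.

```python
import random


def crp_partition(n_subjects, alpha, seed=0):
    """Sample a partition of subjects from the Chinese restaurant process,
    the predictive view of a Dirichlet process prior: the number of
    clusters is not fixed in advance but grows with the data as needed."""
    rng = random.Random(seed)
    cluster_sizes = []   # cluster_sizes[k] = number of subjects in cluster k
    assignments = []     # assignments[i] = cluster index of subject i
    for i in range(n_subjects):
        # Subject i joins existing cluster k with probability
        # |k| / (i + alpha), or opens a new cluster with
        # probability alpha / (i + alpha).
        weights = cluster_sizes + [alpha]
        r = rng.random() * (i + alpha)
        acc = 0.0
        for k, w in enumerate(weights):
            acc += w
            if r <= acc:
                break
        if k == len(cluster_sizes):
            cluster_sizes.append(1)   # new cluster created
        else:
            cluster_sizes[k] += 1     # joined existing cluster
        assignments.append(k)
    return assignments, len(cluster_sizes)
```

Larger values of `alpha` make new clusters more likely, so the expected number of groups grows (logarithmically) with both `alpha` and the number of subjects, while remaining finite for any given dataset.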