In this chapter, we address the situation in which agents need to learn from one another by exchanging learned knowledge. We employ hierarchical Bayesian modelling, which provides a powerful and principled solution to this problem. After pointing out some shortcomings of parametric hierarchical Bayesian modelling, we focus on a nonparametric approach. Nonparametric hierarchical Bayesian modelling has its roots in Bayesian statistics and, in the form of Dirichlet process mixture modelling, was recently introduced to the machine learning community. In this chapter, we hope to provide an accessible introduction to this particular branch of statistics. We present the standard sampling-based learning algorithms and introduce an EM-based learning approach that leads to efficient and plausible solutions. We illustrate the effectiveness of our approach in the context of a recommendation engine, where it allows a principled combination of content-based and collaborative filtering.
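As a rough illustration of the Dirichlet process machinery the chapter builds on, the following sketch draws a random partition from the Chinese restaurant process, the clustering prior that underlies Dirichlet process mixture models: each new item joins an existing cluster with probability proportional to that cluster's size, or starts a new cluster with probability proportional to the concentration parameter alpha. This is a minimal sketch, not code from the chapter; the function name and signature are our own.

```python
import random

def crp_partition(n, alpha, seed=0):
    """Draw a random partition of n items from a Chinese restaurant
    process with concentration parameter alpha.  Returns one cluster
    (table) index per item; tables are numbered 0, 1, 2, ... in order
    of creation."""
    rng = random.Random(seed)
    assignments = []
    table_sizes = []  # number of items already seated at each table
    for i in range(n):
        # Item i joins table k with probability size_k / (i + alpha),
        # or opens a new table with probability alpha / (i + alpha).
        r = rng.uniform(0.0, i + alpha)
        cum = 0.0
        for k, size in enumerate(table_sizes):
            cum += size
            if r < cum:
                assignments.append(k)
                table_sizes[k] += 1
                break
        else:
            # r fell in the alpha-weighted slice: open a new table.
            assignments.append(len(table_sizes))
            table_sizes.append(1)
    return assignments
```

Because the number of occupied tables grows with the data rather than being fixed in advance, this prior sidesteps the model-selection problem of choosing the number of mixture components, which is the key advantage of the nonparametric approach described above.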