In this paper, we show how using the Dirichlet Process mixture model as a generative model of data sets provides a simple and effective method for transfer learning. In particular, we present a hierarchical extension of the classic Naive Bayes classifier that couples multiple Naive Bayes classifiers by placing a Dirichlet Process prior over their parameters, and we show how recent advances in approximate inference in the Dirichlet Process mixture model enable efficient inference. We evaluate the resulting model in a meeting domain, in which the system decides, based on a learned model of the user's behavior, whether to accept or reject a meeting request on the user's behalf. The extended model outperforms the standard Naive Bayes model by using data from other users to influence its predictions.
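The coupling idea can be illustrated with a minimal sketch. This is not the paper's inference procedure (which relies on approximate inference in the Dirichlet Process mixture); it only shows the two ingredients being combined: a Chinese Restaurant Process prior, which governs how users are grouped into clusters under the DP, and a Bernoulli Naive Bayes classifier with Beta(1,1) smoothing trained on the pooled data of a cluster. All names and the toy data are illustrative assumptions.

```python
import math

def crp_probs(cluster_sizes, alpha):
    # Chinese Restaurant Process prior: a new user joins an existing
    # cluster with probability proportional to its size, or opens a
    # new cluster with probability proportional to alpha.
    total = sum(cluster_sizes) + alpha
    return [n / total for n in cluster_sizes] + [alpha / total]

class BernoulliNB:
    """Bernoulli Naive Bayes with Beta(1,1) (Laplace) smoothing.

    Illustrative only: in the hierarchical model, one such classifier
    is shared by all users assigned to the same DP mixture component.
    """
    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.priors, self.theta = {}, {}
        for c in self.classes:
            Xc = [x for x, yi in zip(X, y) if yi == c]
            self.priors[c] = (len(Xc) + 1) / (len(X) + len(self.classes))
            d = len(X[0])
            # Smoothed per-feature probability of feature j being 1 given class c.
            self.theta[c] = [(sum(x[j] for x in Xc) + 1) / (len(Xc) + 2)
                             for j in range(d)]
        return self

    def predict(self, x):
        def log_posterior(c):
            s = math.log(self.priors[c])
            for xj, tj in zip(x, self.theta[c]):
                s += math.log(tj if xj else 1 - tj)
            return s
        return max(self.classes, key=log_posterior)

# CRP prior for a new user given two clusters of sizes 3 and 1, alpha = 1:
print(crp_probs([3, 1], 1.0))  # → [0.6, 0.2, 0.2]

# Two users placed in the same cluster: pooling their sparse data
# gives the shared classifier more evidence than either user alone.
user_a = ([[1, 1], [1, 0]], [1, 1])
user_b = ([[0, 0], [0, 1]], [0, 0])
pooled = BernoulliNB().fit(user_a[0] + user_b[0], user_a[1] + user_b[1])
print(pooled.predict([1, 1]))  # → 1
```

In the full model, the cluster assignments themselves are inferred jointly with the classifier parameters, so data from similar users influences a given user's predictions exactly when the posterior groups them together.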