A new hierarchical nonparametric Bayesian model is proposed for the problem of multitask learning (MTL) with sequential data. Sequential data are typically modeled with a hidden Markov model (HMM), for which an appropriate model structure (number of states) must often be chosen before learning. Here we model the sequential data from each task with an infinite hidden Markov model (iHMM), avoiding the problem of model selection. MTL across the iHMMs is implemented by imposing a nested Dirichlet process (nDP) prior on their base distributions. The resulting nDP-iHMM MTL method performs task-level clustering and data-level clustering simultaneously, thereby enhancing the learning of each individual iHMM while also learning between-task similarities. Learning and inference for the nDP-iHMM MTL are based on a Gibbs sampler. The effectiveness of the framework is demonstrated using synthetic data as well as real music data.
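The two-level clustering the abstract describes can be illustrated with a truncated stick-breaking draw from a nested Dirichlet process. The sketch below is illustrative only, not the paper's implementation: it uses simple Gaussian atoms in place of HMM base measures, and all function names, truncation levels, and concentration parameters are assumptions. Tasks assigned to the same top-level component share a bottom-level DP (task-level clustering), and draws within that DP reuse a common set of atoms (data-level clustering).

```python
import numpy as np

def stick_breaking(alpha, K, rng):
    """Truncated stick-breaking weights for a DP with concentration alpha."""
    betas = rng.beta(1.0, alpha, size=K)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    w = betas * remaining
    return w / w.sum()  # renormalize after truncation

def sample_ndp(n_tasks, alpha_top, alpha_bottom, K_top=20, K_bottom=20, seed=0):
    """Draw from a truncated nested DP (illustrative sketch).

    Each task picks one of K_top shared bottom-level DPs (task-level
    clustering); tasks sharing a bottom-level DP draw from the same
    atom set (data-level clustering). Atoms here are scalar Gaussian
    means for simplicity -- in the nDP-iHMM they would be base
    measures over iHMM parameters.
    """
    rng = np.random.default_rng(seed)
    top_weights = stick_breaking(alpha_top, K_top, rng)
    # one bottom-level DP (weights + atoms) per top-level component
    bottom = [
        (stick_breaking(alpha_bottom, K_bottom, rng),
         rng.normal(0.0, 3.0, size=K_bottom))
        for _ in range(K_top)
    ]
    task_cluster = rng.choice(K_top, size=n_tasks, p=top_weights)
    draws = []
    for t in range(n_tasks):
        w, atoms = bottom[task_cluster[t]]
        idx = rng.choice(K_bottom, size=5, p=w)  # 5 data-level draws per task
        draws.append(atoms[idx])
    return task_cluster, draws
```

In the full model of the abstract, the Gibbs sampler would resample these task-level and data-level assignments jointly with the iHMM states, rather than drawing them once forward as done here.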