Multi-task classification with infinite local experts

  • Authors:
  • Chunping Wang; Qi An; Lawrence Carin; David B. Dunson

  • Affiliations:
  • Department of Electrical and Computer Engineering, Duke University, Durham, NC 27708, USA (Wang, An, Carin); Department of Statistical Science, Duke University, Durham, NC 27708, USA (Dunson)

  • Venue:
  • ICASSP '09 Proceedings of the 2009 IEEE International Conference on Acoustics, Speech and Signal Processing
  • Year:
  • 2009


Abstract

We propose a multi-task learning (MTL) framework for non-linear classification, based on an infinite set of local experts in feature space. The use of local experts enables sharing at the expert level, encouraging the borrowing of information even when tasks are similar only in subregions of feature space. A kernel stick-breaking process (KSBP) prior is imposed on the underlying distribution of class labels, so that the number of experts is inferred in the posterior and model-selection issues are avoided. MTL is implemented by imposing a Dirichlet process (DP) prior on a layer above the task-dependent KSBPs.
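To make the KSBP construction concrete, the sketch below samples covariate-dependent mixture weights from a truncated kernel stick-breaking process: each stick break is modulated by a kernel between the covariate and an expert location, so nearby points tend to share experts. All parameter names (truncation level `H`, concentration `alpha`, bandwidth `psi`) and the RBF kernel choice are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical settings: H = truncation level, alpha = Beta concentration,
# psi = RBF kernel bandwidth (all illustrative, not from the paper).
H, alpha, psi = 20, 1.0, 0.5

# Stick fractions V_h ~ Beta(1, alpha) and expert locations Gamma_h
# scattered in (one-dimensional) feature space.
V = rng.beta(1.0, alpha, size=H)
Gamma = rng.uniform(-2.0, 2.0, size=H)

def ksbp_weights(x):
    """Covariate-dependent weights
    pi_h(x) = V_h K(x, Gamma_h) * prod_{l<h} (1 - V_l K(x, Gamma_l))."""
    K = np.exp(-psi * (x - Gamma) ** 2)   # kernel to each expert location
    u = V * K                             # kernel-modulated stick breaks
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - u)[:-1]))
    return u * remaining

w = ksbp_weights(0.3)
# Weights are nonnegative and sum to at most 1; the leftover mass
# corresponds to experts beyond the truncation level H.
```

Because the weights depend on `x` only through the kernel, two covariates close in feature space receive similar weight vectors, which is what lets tasks share experts in subregions rather than globally.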