Many real-world clustering tasks are closely related, e.g., clustering the web pages of different universities. However, existing clustering approaches neglect this underlying relation and handle such tasks either individually or by simply pooling them together. In this paper, we study a novel clustering paradigm, namely multi-task clustering, which performs multiple related clustering tasks jointly and exploits the relation among the tasks to enhance clustering performance. We aim to learn a subspace shared by all the tasks, through which knowledge can be transferred among them. The objective of our approach consists of two parts: (1) within-task clustering, which clusters the data of each task individually in its input space; and (2) cross-task clustering, which simultaneously learns the shared subspace and clusters the data of all the tasks together in it. We show that the resulting optimization problem can be solved by alternating minimization, and that its convergence is theoretically guaranteed. Furthermore, we show that, given the labels of one task, our multi-task clustering method can be extended to transductive transfer classification (a.k.a. cross-domain classification, domain adaptation). Experiments on several cross-domain text data sets demonstrate that the proposed multi-task clustering significantly outperforms traditional single-task clustering methods, and that the transductive transfer classification method is comparable to, or even better than, several existing transductive transfer classification approaches.
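To make the two-part objective described in the abstract concrete, the following is a rough illustrative sketch, not the authors' exact formulation; the notation (X, W, lambda, the centroid symbols) is assumed here for exposition.

\min_{\{\mu^{(m)}\},\; \{\nu\},\; W:\, W^{\top} W = I} \;
\sum_{m=1}^{M} \sum_{i=1}^{n_m} \big\| x_i^{(m)} - \mu^{(m)}_{c_m(i)} \big\|^2
\;+\; \lambda \sum_{m=1}^{M} \sum_{i=1}^{n_m} \big\| W^{\top} x_i^{(m)} - \nu_{\tilde{c}_m(i)} \big\|^2

Here x_i^{(m)} denotes the i-th data point of task m. The first term is within-task clustering: a k-means-style objective with per-task centroids \mu^{(m)} in each task's input space. The second term is cross-task clustering: the data of all tasks are projected by a shared orthonormal matrix W and clustered against centroids \nu that are shared across tasks, with \lambda trading off the two parts. Under this reading, alternating minimization would cycle between (a) updating cluster assignments and centroids with W fixed and (b) updating W with the assignments fixed, an orthogonality-constrained subproblem typically handled by an eigen-decomposition.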