Coupled semi-supervised learning for information extraction
Proceedings of the third ACM international conference on Web search and data mining
Semi-supervised projection clustering with transferred centroid regularization
ECML PKDD'10 Proceedings of the 2010 European conference on Machine learning and knowledge discovery in databases: Part III
Relevant subtask learning by constrained mixture models
Intelligent Data Analysis
Neurocomputing
A compression-based dissimilarity measure for multi-task clustering
ISMIS'11 Proceedings of the 19th international conference on Foundations of intelligent systems
Multi-task regularization of generative similarity models
SIMBAD'11 Proceedings of the First international conference on Similarity-based pattern recognition
Semi-supervised classification by probabilistic relaxation
CIARP'11 Proceedings of the 16th Iberoamerican Congress conference on Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications
A case study on meta-generalising: a Gaussian processes approach
The Journal of Machine Learning Research
Semi-supervised learning from a translation model between data distributions
IJCAI'11 Proceedings of the Twenty-Second international joint conference on Artificial Intelligence - Volume Two
Context plays an important role when performing classification, and in this paper we examine context from two perspectives. First, the classification of items within a single task is placed within the context of distinct concurrent or previous classification tasks (multiple distinct data collections). This is referred to as multi-task learning (MTL), and is implemented here in a statistical manner, using a simplified form of the Dirichlet process. In addition, when performing many classification tasks one has simultaneous access to all unlabeled data that must be classified, and therefore there is an opportunity to place the classification of any one feature vector within the context of all unlabeled feature vectors; this is referred to as semi-supervised learning. In this paper we integrate MTL and semi-supervised learning into a single framework, thereby exploiting both forms of contextual information. Results are first presented on a synthetic "toy" example to demonstrate the concept, and the algorithm is then applied to three real data sets.
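The semi-supervised component described in the abstract can be illustrated with a minimal sketch: class-conditional Gaussian models are fit from a few labeled points, then iteratively refined with soft (EM-style) assignments of the unlabeled pool. This is a generic semi-supervised EM illustration, not the paper's actual algorithm: the paper couples tasks through a simplified Dirichlet process, whereas here multi-task sharing would amount to simply pooling labeled data across tasks. All function names below are illustrative.

```python
import numpy as np

def semi_supervised_em(X_l, y_l, X_u, n_iter=50):
    """Fit per-class isotropic Gaussian means from labeled data (X_l, y_l),
    then refine them with soft assignments of unlabeled data X_u.
    A crude stand-in for the semi-supervised half of the framework."""
    classes = np.unique(y_l)
    means = np.array([X_l[y_l == c].mean(axis=0) for c in classes])
    for _ in range(n_iter):
        # E-step: responsibilities of unlabeled points under unit-variance
        # Gaussians (shift distances for numerical stability before exp)
        d = ((X_u[:, None, :] - means[None, :, :]) ** 2).sum(-1)
        r = np.exp(-0.5 * (d - d.min(axis=1, keepdims=True)))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update means from hard labeled counts + soft unlabeled counts
        for k, c in enumerate(classes):
            num = X_l[y_l == c].sum(axis=0) + (r[:, k:k + 1] * X_u).sum(axis=0)
            den = (y_l == c).sum() + r[:, k].sum()
            means[k] = num / den
    return classes, means

def predict(classes, means, X):
    """Assign each point to the class with the nearest mean."""
    d = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)
    return classes[d.argmin(axis=1)]
```

In this toy setting the unlabeled pool pulls each class mean toward its true cluster center, which is the sense in which classifying one feature vector uses the context of all unlabeled feature vectors.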