We study the problem of building a classification model for a target class in the absence of any labeled training examples for that class. To address this difficult learning problem, we extend the idea of transfer learning by assuming that the following side information is available: (i) a collection of labeled examples belonging to other classes in the problem domain, called the auxiliary classes; (ii) class information, including the prior of the target class and the correlation between the target class and the auxiliary classes. Our goal is to construct a classification model for the target class by leveraging the above data and information. We refer to this learning problem as unsupervised transfer classification. Our framework is based on the generalized maximum entropy model, which is effective in transferring the label information of the auxiliary classes to the target class. A theoretical analysis shows that, under certain assumptions, the classification model obtained by the proposed approach converges to the optimal model that would be learned from labeled examples of the target class. An empirical study of text categorization on four data sets verifies the effectiveness of the proposed approach.
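As a rough illustration of the setting (not the paper's generalized maximum entropy formulation), one can approximate the target-class posterior by combining auxiliary-class posteriors through the given correlations, then calibrating the decision threshold to the given target prior. The sketch below is a minimal stand-in: the data, feature dimensions, and auxiliary classifiers are all synthetic assumptions, and the hidden target labels are used only to simulate the "given" side information and to evaluate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic domain: 400 documents as 5-d feature vectors. Two auxiliary
# classes come with labels; the target class has NO labeled examples.
n = 400
X = rng.normal(size=(n, 5))
w_aux1 = np.array([1.5, 0.0, 0.0, 0.5, 0.0])
w_aux2 = np.array([0.0, 1.5, 0.0, 0.5, 0.0])
y_aux1 = (X @ w_aux1 + rng.normal(scale=0.5, size=n) > 0).astype(float)
y_aux2 = (X @ w_aux2 + rng.normal(scale=0.5, size=n) > 0).astype(float)
# Hidden target concept: used only to simulate the side information
# (prior, correlations) and to evaluate -- never for training.
w_tgt = np.array([1.0, 1.0, 0.0, 0.5, 0.0])
y_tgt = (X @ w_tgt > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logreg(X, y, lr=0.1, steps=500):
    """Plain gradient-descent logistic regression (no regularization)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ w)
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Side information assumed given: the target prior and the correlation
# of the target class with each auxiliary class.
prior = y_tgt.mean()
corr = np.array([np.corrcoef(y_tgt, y_aux1)[0, 1],
                 np.corrcoef(y_tgt, y_aux2)[0, 1]])

# Step 1: train classifiers on the labeled auxiliary classes.
P_aux = np.column_stack([sigmoid(X @ fit_logreg(X, y_aux1)),
                         sigmoid(X @ fit_logreg(X, y_aux2))])

# Step 2: correlation-weighted score for the unseen target class.
score = P_aux @ corr / corr.sum()

# Step 3: calibrate the threshold so the predicted positive rate
# matches the given target prior.
thresh = np.quantile(score, 1.0 - prior)
pred = (score >= thresh).astype(float)

acc = (pred == y_tgt).mean()
print(f"accuracy without any target labels: {acc:.2f}")
```

The point of the sketch is only the information flow: auxiliary labels supply the classifiers, while the prior and correlations supply the only signal about the target class itself. The paper's actual approach encodes this transfer inside a generalized maximum entropy objective rather than the ad hoc weighting used here.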