Transfer learning techniques have seen significant development in real applications where knowledge from previous tasks is exploited to reduce the high cost of acquiring labeled data for the target task. However, how to avoid negative transfer, which arises when tasks have different distributions in a heterogeneous environment, remains an open problem. To address this issue, we propose a Compact Coding method for Hyperplane Classifiers (CCHC) under a two-level framework in the inductive transfer learning setting. Unlike traditional methods, we measure the similarities among tasks from a macro-level perspective through minimum encoding. Specifically, the degree of similarity is represented by the code length of the class boundary of each source task with respect to the target task. In addition, informative parts of the source tasks are adaptively selected from a micro-level viewpoint to make the choice of the specific source task more accurate. Extensive experiments demonstrate the effectiveness of our algorithm in terms of classification accuracy on both UCI and text data sets.
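The core idea, measuring task similarity by the code length one object attains relative to another, can be sketched with a generic compression-based dissimilarity (the normalized compression distance), using `zlib` compressed size as a stand-in for minimum code length. This is an illustrative analogue, not the authors' CCHC algorithm; the byte strings and function names here are hypothetical:

```python
import zlib


def code_length(data: bytes) -> int:
    """Approximate the minimum code length of `data` by its zlib-compressed size."""
    return len(zlib.compress(data, 9))


def compression_dissimilarity(x: bytes, y: bytes) -> float:
    """Normalized compression distance: if y shares structure with x,
    encoding the concatenation costs little beyond encoding x alone,
    so the distance is small (near 0); unrelated inputs approach 1."""
    cx, cy = code_length(x), code_length(y)
    cxy = code_length(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)


# Toy data: two near-identical "tasks" and one unrelated one.
task_a = b"hyperplane boundary parameters " * 50
task_b = b"hyperplane boundary parameters " * 50
task_c = b"completely unrelated byte content!" * 50

# Related tasks compress well together, so their distance is smaller.
assert compression_dissimilarity(task_a, task_b) < compression_dissimilarity(task_a, task_c)
```

In CCHC's setting the objects being encoded would be class boundaries of source and target tasks rather than raw byte strings, but the selection rule is analogous: prefer the source task whose encoding relative to the target is shortest.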