Domain adaptation learning (DAL) is an effective technique for pattern classification problems in which the prior information needed for training is unavailable or insufficient. Its effectiveness depends on the discrepancy between the two distributions that respectively generate the training data of the source domain and the testing data of the target domain. However, DAL may not perform well when only the discrepancy between the distribution means of the source and target domains is considered and minimized. In this paper, we first construct a generalized projected maximum distribution discrepancy (GPMDD) metric for DAL on domain distributions embedded in a reproducing kernel Hilbert space (RKHS), which simultaneously accounts for both the projected maximum distribution mean discrepancy and the projected maximum distribution scatter discrepancy between the source and target domains. Based on the principle of jointly minimizing the structural risk and the GPMDD, we then propose a novel domain adaptation kernelized support vector machine (DAKSVM) derived from the classical SVM, together with two extensions, LS-DAKSVM and μ-DAKSVM, derived from the least-squares SVM and the ν-SVM, respectively. Moreover, our theoretical analysis shows that the proposed GPMDD metric effectively measures the consistency not only between the RKHS embeddings of the domain distributions but also between the scatter information of the source and target domains. The proposed methods are therefore distinctive in that the more consistent the scatter information of the source and target domains becomes through tuning the kernel bandwidth, the better the GPMDD minimization converges, which in turn improves the scalability and generalization capability of the proposed methods for DAL. Experimental results on artificial and real-world problems indicate that the performance of the proposed methods is superior to, or at least comparable with, existing benchmark methods.
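As an illustrative sketch only (not the authors' implementation), a discrepancy of the kind described above — an RKHS mean-matching term, as in the maximum mean discrepancy, plus a scatter-matching term — can be estimated from samples with an RBF kernel. The function names, the trade-off parameter `lam`, and the particular scatter statistic are assumptions for illustration; the GPMDD in the paper additionally involves projection directions, which are omitted here.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Pairwise RBF (Gaussian) kernel matrix between the rows of A and B.
    sq = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-sq / (2.0 * sigma**2))

def distribution_discrepancy(Xs, Xt, sigma=1.0, lam=1.0):
    """Empirical mean discrepancy (squared MMD) plus a simple
    scatter-discrepancy term between source samples Xs and target
    samples Xt, both mapped into an RKHS by an RBF kernel."""
    Kss = rbf_kernel(Xs, Xs, sigma)
    Ktt = rbf_kernel(Xt, Xt, sigma)
    Kst = rbf_kernel(Xs, Xt, sigma)
    # Squared RKHS distance between the two empirical kernel means.
    mean_term = Kss.mean() + Ktt.mean() - 2.0 * Kst.mean()
    # Scatter around each kernel mean: average squared RKHS distance
    # of samples from their own empirical mean (trace of the centred
    # Gram matrix, divided by the sample size).
    scat_s = np.trace(Kss) / len(Xs) - Kss.mean()
    scat_t = np.trace(Ktt) / len(Xt) - Ktt.mean()
    scatter_term = (scat_s - scat_t) ** 2
    return mean_term + lam * scatter_term
```

Minimizing a penalty of this form over a classifier's projection, as the abstract describes, encourages the source and target data to agree not only in their RKHS means but also in their spread, with the kernel bandwidth `sigma` controlling how sharply the two scatters are compared.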