Multi-task learning has been developed as an effective way to improve generalization performance by training multiple related tasks simultaneously. Determining the relatedness between tasks is usually the key to formulating a multi-task learning method. In this paper, we assume that when tasks are related, their models are close to one another; that is, their model parameters lie close to a certain mean function. Following this task-relatedness assumption, we present two multi-task learning formulations based on one-class support vector machines (one-class SVMs). With the help of a new kernel design, both multi-task learning methods can be solved by the optimization program of a single one-class SVM. Experiments conducted on both a low-dimensional nonlinear toy dataset and high-dimensional textured images show that our approaches lead to very encouraging results.
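The key mechanism above — reducing several related one-class tasks to a single one-class SVM through kernel design — can be illustrated with a short sketch. This is not the paper's exact formulation; it assumes a multi-task kernel of the regularized multi-task learning form, K((x, s), (x', t)) = (1/μ + δ_st) k(x, x'), where δ_st couples points from the same task and μ controls how strongly task models are pulled toward a shared mean. The task id is appended as an extra feature column so that a standard one-class SVM solver can be reused unchanged:

```python
import numpy as np
from sklearn.svm import OneClassSVM

def multitask_kernel(X1, X2, mu=1.0, gamma=0.5):
    """Multi-task kernel over rows of the form [features..., task_id].

    K((x, s), (x', t)) = (1/mu + [s == t]) * rbf(x, x')
    Small mu -> tasks share more; large mu -> tasks are nearly independent.
    """
    F1, t1 = X1[:, :-1], X1[:, -1]
    F2, t2 = X2[:, :-1], X2[:, -1]
    # Squared Euclidean distances for the RBF base kernel k(x, x').
    sq = (F1 ** 2).sum(1)[:, None] + (F2 ** 2).sum(1)[None, :] - 2.0 * F1 @ F2.T
    base = np.exp(-gamma * sq)
    same_task = (t1[:, None] == t2[None, :]).astype(float)
    return (1.0 / mu + same_task) * base

# Two related tasks: nearby Gaussian clouds (hypothetical toy data).
rng = np.random.default_rng(0)
X_a = rng.normal([0.0, 0.0], 0.3, size=(40, 2))
X_b = rng.normal([0.5, 0.5], 0.3, size=(40, 2))
X = np.vstack([
    np.hstack([X_a, np.zeros((40, 1))]),   # task id 0
    np.hstack([X_b, np.ones((40, 1))]),    # task id 1
])

# A single one-class SVM solves both tasks jointly via the shared kernel.
clf = OneClassSVM(kernel=multitask_kernel, nu=0.1).fit(X)
```

After fitting, `clf.decision_function` scores a query point for a given task (the appended id selects which task's model is queried); points near a task's support score higher than distant outliers. The coupling strength `mu` is the knob that encodes the "models close to a mean" assumption.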