Multi-view semi-supervised learning methods try to exploit multiple views together with large amounts of unlabeled data in order to learn better predictive functions when labeled data is scarce. However, a lack of complete view data limits the applicability of multi-view semi-supervised learning to real-world data: commonly, one view is readily and cheaply available, while additional views are costly or observed only for some instances. This work aims to make multi-view semi-supervised learning approaches more applicable to real-world data by addressing the issue of missing views. We introduce CoNet, a feature generation method that learns a mapping from one view to another, designed specifically to produce features that are useful to multi-view semi-supervised learning algorithms. The learned mapping is then used to fill in missing views as a pre-processing step. A comprehensive experimental study demonstrates the utility of our method compared to state-of-the-art multi-view semi-supervised learning methods in this scenario of partially observed views.
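The abstract does not spell out CoNet's architecture, so the following is only a minimal sketch of the general idea it describes: train a regressor from view 1 to view 2 on the instances where both views are observed, then use it to fill in view 2 where it is missing, before handing the completed data to a multi-view semi-supervised learner. The one-hidden-layer network and the names `fit_view_mapping` / `fill_missing_view` are illustrative assumptions, not CoNet's actual model or API.

```python
import numpy as np

def fit_view_mapping(X1, X2, hidden=16, lr=0.1, epochs=2000, seed=0):
    """Fit a mapping f: view1 -> view2 (one tanh hidden layer, squared loss).

    X1, X2: arrays of shape (n, d1) and (n, d2) for instances where BOTH
    views are observed. Returns the learned parameters.
    (Stand-in for a view-completion model; not CoNet itself.)
    """
    rng = np.random.default_rng(seed)
    d1, d2 = X1.shape[1], X2.shape[1]
    W1 = rng.normal(scale=0.1, size=(d1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.1, size=(hidden, d2)); b2 = np.zeros(d2)
    n = X1.shape[0]
    for _ in range(epochs):
        H = np.tanh(X1 @ W1 + b1)        # hidden activations
        err = (H @ W2 + b2) - X2         # residual of predicted view-2 features
        # gradient descent on mean squared error
        gW2 = H.T @ err / n
        gb2 = err.mean(axis=0)
        dH = (err @ W2.T) * (1.0 - H ** 2)
        gW1 = X1.T @ dH / n
        gb1 = dH.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return W1, b1, W2, b2

def fill_missing_view(X1, X2, observed_mask, params):
    """Pre-processing step: replace rows of X2 whose view is missing
    (observed_mask False) with the mapped features f(X1)."""
    W1, b1, W2, b2 = params
    pred = np.tanh(X1 @ W1 + b1) @ W2 + b2
    X2_filled = X2.copy()
    X2_filled[~observed_mask] = pred[~observed_mask]
    return X2_filled
```

After this step, every instance has both views populated, so any standard multi-view semi-supervised algorithm (e.g. co-training-style methods) can be applied unchanged to the completed data.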