Multi-view learning has attracted considerable attention in recent years. In this paper, we first characterize the sample complexity of multi-view active learning. Under the α-expansion assumption, we obtain an exponential improvement in the sample complexity, from the usual Õ(1/ε) to Õ(log(1/ε)), without requiring strong assumptions on the data distribution (such as uniformity over the unit sphere in R^d) or on the hypothesis class (such as linear separators through the origin). We also give an upper bound on the error rate when the α-expansion assumption does not hold. We then analyze the combination of multi-view active learning with semi-supervised learning and obtain a further improvement in the sample complexity. Finally, we study the empirical behavior of the two paradigms; the experiments verify that combining multi-view active learning with semi-supervised learning is effective.
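To make the setting concrete, below is a minimal, self-contained sketch of the disagreement-based querying idea that multi-view active learning builds on: each example has two views, a separate classifier is trained per view, and labels are requested only for "contention points" where the two view classifiers disagree. The synthetic two-view data, the simple threshold learners, and the batch sizes are illustrative assumptions for this sketch, not the algorithm or bounds analyzed in the paper.

```python
import random

random.seed(0)

def make_example():
    """One example with two conditionally redundant views of a ±1 label."""
    y = random.choice([-1, 1])
    v1 = y * random.uniform(0.3, 1.0) + random.gauss(0.0, 0.2)
    v2 = y * random.uniform(0.3, 1.0) + random.gauss(0.0, 0.2)
    return (v1, v2), y

def fit_threshold(values, labels):
    """Pick the threshold t maximizing training accuracy of sign(x - t)."""
    best_t, best_acc = 0.0, -1.0
    for t in list(values) + [0.0]:
        acc = sum((1 if v > t else -1) == y
                  for v, y in zip(values, labels)) / len(values)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

pool = [make_example() for _ in range(300)]
oracle = dict(pool)                      # label oracle, queried on demand
labeled = pool[:10]                      # small labeled seed set
unlabeled = [x for x, _ in pool[10:]]

queries = 0
for _ in range(5):                       # a few active-learning rounds
    t1 = fit_threshold([x[0] for x, _ in labeled], [y for _, y in labeled])
    t2 = fit_threshold([x[1] for x, _ in labeled], [y for _, y in labeled])
    # contention points: the two view classifiers disagree on the label
    contention = [x for x in unlabeled if (x[0] > t1) != (x[1] > t2)]
    for x in contention[:10]:            # query a small batch per round
        labeled.append((x, oracle[x]))
        unlabeled.remove(x)
        queries += 1

# final hypothesis: average the two views' margins
t1 = fit_threshold([x[0] for x, _ in labeled], [y for _, y in labeled])
t2 = fit_threshold([x[1] for x, _ in labeled], [y for _, y in labeled])
correct = sum((1 if (x[0] - t1) + (x[1] - t2) > 0 else -1) == y
              for x, y in pool)
acc = correct / len(pool)
print(f"queried {queries} extra labels, pool accuracy {acc:.2f}")
```

Because labels are spent only where the views conflict, far fewer labels are queried than the pool size, which is the intuition behind the improved sample complexity under the α-expansion assumption.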