In this paper, we first present a self-training semi-supervised support vector machine (SVM) algorithm, together with a corresponding model-selection method, designed to train a classifier from a small amount of labeled data. We then prove the convergence of this algorithm. Two examples are presented to demonstrate the validity of the algorithm with model selection. Finally, we apply the algorithm to a data set collected from a P300-based brain-computer interface (BCI) speller, and show that it significantly reduces the training effort required by the P300-based BCI speller.
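The general self-training idea behind such an algorithm can be sketched as follows. This is a minimal illustration, not the paper's exact method: it assumes scikit-learn's `SVC`, uses the SVM decision margin as the confidence measure, and the function name `self_training_svm` and the `threshold` parameter are hypothetical choices for this sketch.

```python
# Minimal self-training SVM sketch: fit on the labeled set, pseudo-label the
# unlabeled points the classifier is most confident about (large margin),
# add them to the training set, and refit until nothing confident remains.
import numpy as np
from sklearn.svm import SVC

def self_training_svm(X_lab, y_lab, X_unl, threshold=1.0, max_iter=10):
    X_lab, y_lab, X_unl = map(np.asarray, (X_lab, y_lab, X_unl))
    clf = SVC(kernel="linear").fit(X_lab, y_lab)
    for _ in range(max_iter):
        if len(X_unl) == 0:
            break
        margin = clf.decision_function(X_unl)    # signed distance to hyperplane
        confident = np.abs(margin) >= threshold  # pseudo-label only confident points
        if not confident.any():
            break
        pseudo_y = clf.predict(X_unl[confident])
        X_lab = np.vstack([X_lab, X_unl[confident]])
        y_lab = np.concatenate([y_lab, pseudo_y])
        X_unl = X_unl[~confident]
        clf = SVC(kernel="linear").fit(X_lab, y_lab)  # refit on enlarged set
    return clf
```

In a P300-speller setting, `X_lab` would hold the few calibrated EEG feature vectors and `X_unl` the unlabeled online trials; each round enlarges the training set without further calibration, which is the source of the reduced training effort.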