Semi-supervised learning concerns the problem of learning from both labeled and unlabeled data. Several boosting algorithms have been extended to semi-supervised learning with various strategies. To our knowledge, however, none of them takes all three semi-supervised assumptions, i.e., the smoothness, cluster, and manifold assumptions, into account simultaneously during boosting. In this paper, we propose a novel cost functional that combines the margin cost on labeled data with a regularization penalty on unlabeled data derived from these three fundamental semi-supervised assumptions. Minimizing the proposed cost functional with a greedy, stagewise functional optimization procedure leads to a generic boosting framework for semi-supervised learning. Extensive experiments demonstrate that our algorithm yields favorable results on benchmark and real-world classification tasks in comparison to state-of-the-art semi-supervised learning algorithms, including recently developed boosting algorithms. Finally, we discuss relevant issues and relate our algorithm to previous work.
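For illustration only, a cost functional of the kind described above might combine an exponential margin cost over the l labeled examples with a pairwise smoothness/manifold penalty over all l + u examples, and each boosting round would add the base learner that best descends this cost. The notation below (ensemble F, base learner h_t, similarity weights W_{jk}, trade-off lambda) is assumed for exposition and is not the paper's exact formulation:

% Sketch under assumed notation (not the paper's exact functional):
% the first term is a margin cost on the l labeled points, the second a
% pairwise smoothness/manifold penalty over all l+u points weighted by W_{jk}.
\[
  C(F) \;=\; \sum_{i=1}^{l} \exp\!\bigl(-y_i F(x_i)\bigr)
  \;+\; \lambda \sum_{j,k=1}^{l+u} W_{jk}\,\bigl(F(x_j) - F(x_k)\bigr)^{2}
\]
% Greedy stagewise minimization then yields a boosting-style update
% F_t = F_{t-1} + \alpha_t h_t, where h_t is the base learner most aligned
% with the negative functional gradient of C at F_{t-1}.

Each choice here (exponential margin loss, quadratic pairwise penalty) is just one common instantiation; a generic framework of this kind admits other margin costs and regularizers within the same stagewise template.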