Improving regression estimation: Averaging methods for variance reduction with extensions to general convex measure optimization
The generalization ability of a machine learning algorithm varies with the values assigned to its hyperparameters and with the degree of noise in the training dataset. If the dataset contains a sufficient number of labeled data points, an optimal hyperparameter value can be found by validation on a held-out subset. In semi-supervised learning, however, the dataset is assumed to contain only a few labeled data points, so holding some of them out for validation is impractical. The scarcity of labeled data also makes it difficult to estimate the degree of noise in the dataset. To circumvent these difficulties, we propose to employ ensemble learning and graph sharpening. The former replaces the hyperparameter selection procedure with an ensemble network of committee members, each trained with a different hyperparameter value. The latter improves performance by removing the unhelpful information flow caused by noise. Experimental results show that the proposed method improves performance on publicly available benchmark problems.
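The two ideas above can be illustrated with a minimal sketch (not the authors' code): an ensemble that averages graph-based label propagation results obtained under several hyperparameter values instead of validating a single one, combined with a simple edge-pruning step standing in for graph sharpening. The RBF affinity, the `threshold` pruning rule, and all function names here are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def rbf_affinity(X, sigma):
    # Pairwise RBF similarities; the diagonal is zeroed so nodes are not self-linked.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return W

def sharpen(W, threshold):
    # Hypothetical sharpening rule: drop edges weaker than `threshold`.
    # (The paper's sharpening rule differs; this only illustrates the idea
    # of removing edges that carry unhelpful, noise-driven information.)
    W = W.copy()
    W[W < threshold] = 0.0
    return W

def propagate(W, y, labeled, n_iter=200):
    # Iterative label propagation: each unlabeled score is repeatedly replaced
    # by the weighted average of its neighbours; labeled scores stay clamped.
    D = W.sum(axis=1)
    D[D == 0] = 1.0          # isolated nodes keep a score of 0
    f = y.astype(float).copy()
    for _ in range(n_iter):
        f = W @ f / D
        f[labeled] = y[labeled]
    return f

def ensemble_predict(X, y, labeled, sigmas, threshold=1e-3):
    # Average the propagated scores over several sigma values rather than
    # selecting one sigma via validation data that is not available.
    preds = [propagate(sharpen(rbf_affinity(X, s), threshold), y, labeled)
             for s in sigmas]
    return np.mean(preds, axis=0)

# Toy example: two Gaussian blobs with one labeled point each.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
y = np.zeros(40)
y[0], y[20] = 1.0, -1.0
labeled = np.array([0, 20])
scores = ensemble_predict(X, y, labeled, sigmas=[0.25, 0.5, 1.0])
```

Averaging over the `sigmas` list plays the role of the committee: no single hyperparameter value has to be selected, and each member's errors are partially cancelled by the others.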