Transductive Inference for Text Classification using Support Vector Machines
ICML '99 Proceedings of the Sixteenth International Conference on Machine Learning
Stochastic Resonance Neural Network and Its Performance
IJCNN '00 Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN'00) - Volume 2
Introduction to Machine Learning (Adaptive Computation and Machine Learning)
Random matrices in data analysis
PKDD '04 Proceedings of the 8th European Conference on Principles and Practice of Knowledge Discovery in Databases
Label propagation through linear neighborhoods
ICML '06 Proceedings of the 23rd international conference on Machine learning
Nonlinear Boosting Projections for Ensemble Construction
The Journal of Machine Learning Research
Semi-Supervised Learning
Graph based semi-supervised learning with sharper edges
ECML'06 Proceedings of the 17th European conference on Machine Learning
Combining smooth graphs with semi-supervised classification
PAKDD'06 Proceedings of the 10th Pacific-Asia conference on Advances in Knowledge Discovery and Data Mining
Most graph-based semi-supervised learning methods model the structure of a dataset as a single k-NN graph. Although graph construction is an important task, many existing graph-based methods build a graph from a dataset directly and naively. While the resulting k-NN graph provides a relatively good representation of the dataset, it generally produces inappropriate shortcuts across cluster boundaries. In this paper, we propose a novel approach for modeling and combining multiple graphs with different edge weights to avoid such undesirable behavior. By combining those graphs, we can systematically reduce the effect of noise, in a fashion conceptually similar to an ensemble approach. Experimental results demonstrate that our approach improves classification accuracy on both benchmark and artificial datasets.
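The combination idea in the abstract can be illustrated with a minimal sketch: build several k-NN graphs whose edges are weighted with Gaussian kernels of different bandwidths, average them into one graph, and run simple iterative label propagation over the result. All function names, the bandwidth parameters, and the propagation scheme below are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def knn_graph(X, k, sigma):
    """A single k-NN graph with Gaussian edge weights of bandwidth sigma."""
    # Pairwise squared Euclidean distances.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Keep only each point's k nearest neighbours (column 0 is the point itself).
    idx = np.argsort(d2, axis=1)[:, 1:k + 1]
    mask = np.zeros_like(W, dtype=bool)
    mask[np.arange(len(X))[:, None], idx] = True
    # Symmetrise: keep an edge if either endpoint selected it.
    return np.where(mask | mask.T, W, 0.0)

def combine_graphs(X, k, sigmas):
    """Average graphs built with different edge weights (ensemble-style)."""
    return sum(knn_graph(X, k, s) for s in sigmas) / len(sigmas)

def label_propagation(W, y, n_iter=100):
    """Propagate labels over the combined graph; y uses -1 for unlabelled."""
    labeled = y >= 0
    classes = np.unique(y[labeled])
    F = np.zeros((len(y), len(classes)))
    F[labeled, np.searchsorted(classes, y[labeled])] = 1.0
    # Row-normalise W into a transition matrix.
    D = W.sum(axis=1, keepdims=True)
    D[D == 0] = 1.0
    P = W / D
    for _ in range(n_iter):
        F = P @ F
        # Clamp the known labels after each propagation step.
        F[labeled] = 0.0
        F[labeled, np.searchsorted(classes, y[labeled])] = 1.0
    return classes[F.argmax(axis=1)]
```

Averaging graphs built at several bandwidths smooths out edges that appear only at one scale, which is one simple way to read the abstract's claim about reducing noise in an ensemble-like fashion.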